var/home/core/zuul-output/logs/kubelet.log:
Oct 01 12:37:00 crc systemd[1]: Starting Kubernetes Kubelet... Oct 01 12:37:00 crc restorecon[4651]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c138,c778 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 
Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c108,c511 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:00 crc 
restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 12:37:00 crc 
restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc 
restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc 
restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:00 
crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 
12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:00 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 
12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 12:37:01 crc 
restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 
12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 
12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc 
restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:01 crc restorecon[4651]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 12:37:01 crc restorecon[4651]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 01 12:37:02 crc kubenswrapper[4727]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 12:37:02 crc kubenswrapper[4727]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 01 12:37:02 crc kubenswrapper[4727]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 12:37:02 crc kubenswrapper[4727]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 01 12:37:02 crc kubenswrapper[4727]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 01 12:37:02 crc kubenswrapper[4727]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.135885 4727 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140836 4727 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140862 4727 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140870 4727 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140877 4727 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140883 4727 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140889 4727 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140895 4727 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140900 4727 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140905 4727 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140910 4727 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140915 4727 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140919 4727 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140924 4727 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140928 4727 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140933 4727 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140938 4727 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140942 4727 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140956 4727 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140962 4727 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140967 4727 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 
12:37:02.140973 4727 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140979 4727 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140985 4727 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.140990 4727 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141016 4727 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141022 4727 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141027 4727 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141032 4727 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141036 4727 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141040 4727 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141045 4727 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141051 4727 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141058 4727 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141063 4727 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141067 4727 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141072 4727 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141077 4727 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141082 4727 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141087 4727 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141093 4727 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141098 4727 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141103 4727 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141107 4727 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141112 4727 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141116 4727 feature_gate.go:330] unrecognized feature gate: Example Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141120 4727 feature_gate.go:330] unrecognized 
feature gate: VSphereControlPlaneMachineSet Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141124 4727 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141128 4727 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141133 4727 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141138 4727 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141142 4727 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141146 4727 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141150 4727 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141154 4727 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141159 4727 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141163 4727 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141167 4727 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141171 4727 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141175 4727 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141182 4727 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141187 4727 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141191 4727 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141195 4727 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141199 4727 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141203 4727 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141214 4727 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141219 4727 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141223 4727 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141227 4727 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141231 4727 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.141235 4727 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141343 4727 flags.go:64] FLAG: --address="0.0.0.0" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141357 4727 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141367 4727 flags.go:64] FLAG: --anonymous-auth="true" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141374 4727 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141381 4727 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141386 4727 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141393 4727 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141400 4727 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141405 4727 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141410 4727 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141415 4727 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141420 4727 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141425 4727 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141430 4727 flags.go:64] FLAG: --cgroup-root="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141435 4727 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141440 4727 flags.go:64] FLAG: --client-ca-file="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141445 4727 
flags.go:64] FLAG: --cloud-config="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141450 4727 flags.go:64] FLAG: --cloud-provider="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141454 4727 flags.go:64] FLAG: --cluster-dns="[]" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141460 4727 flags.go:64] FLAG: --cluster-domain="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141465 4727 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141470 4727 flags.go:64] FLAG: --config-dir="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141475 4727 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141480 4727 flags.go:64] FLAG: --container-log-max-files="5" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141487 4727 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141492 4727 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141497 4727 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141502 4727 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141508 4727 flags.go:64] FLAG: --contention-profiling="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141512 4727 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141519 4727 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141524 4727 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141530 4727 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141537 4727 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141542 4727 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141547 4727 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141552 4727 flags.go:64] FLAG: --enable-load-reader="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141557 4727 flags.go:64] FLAG: --enable-server="true" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141562 4727 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141573 4727 flags.go:64] FLAG: --event-burst="100" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141578 4727 flags.go:64] FLAG: --event-qps="50" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141583 4727 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141588 4727 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141593 4727 flags.go:64] FLAG: --eviction-hard="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141599 4727 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141604 4727 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141609 4727 flags.go:64] 
FLAG: --eviction-pressure-transition-period="5m0s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141614 4727 flags.go:64] FLAG: --eviction-soft="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141618 4727 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141623 4727 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141628 4727 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141633 4727 flags.go:64] FLAG: --experimental-mounter-path="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141638 4727 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141643 4727 flags.go:64] FLAG: --fail-swap-on="true" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141648 4727 flags.go:64] FLAG: --feature-gates="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141654 4727 flags.go:64] FLAG: --file-check-frequency="20s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141659 4727 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141664 4727 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141669 4727 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141674 4727 flags.go:64] FLAG: --healthz-port="10248" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141679 4727 flags.go:64] FLAG: --help="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141684 4727 flags.go:64] FLAG: --hostname-override="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141689 4727 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141694 4727 flags.go:64] FLAG: --http-check-frequency="20s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141699 4727 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141705 4727 flags.go:64] FLAG: --image-credential-provider-config="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141711 4727 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141716 4727 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141721 4727 flags.go:64] FLAG: --image-service-endpoint="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141726 4727 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141731 4727 flags.go:64] FLAG: --kube-api-burst="100" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141735 4727 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141741 4727 flags.go:64] FLAG: --kube-api-qps="50" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141745 4727 flags.go:64] FLAG: --kube-reserved="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141750 4727 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141755 4727 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141760 4727 flags.go:64] FLAG: 
--kubelet-cgroups="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141764 4727 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141770 4727 flags.go:64] FLAG: --lock-file="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141775 4727 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141781 4727 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141785 4727 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141793 4727 flags.go:64] FLAG: --log-json-split-stream="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141798 4727 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141803 4727 flags.go:64] FLAG: --log-text-split-stream="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141808 4727 flags.go:64] FLAG: --logging-format="text" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141813 4727 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141819 4727 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141824 4727 flags.go:64] FLAG: --manifest-url="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141829 4727 flags.go:64] FLAG: --manifest-url-header="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141835 4727 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141840 4727 flags.go:64] FLAG: --max-open-files="1000000" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141846 4727 flags.go:64] FLAG: --max-pods="110" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141851 4727 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141886 4727 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141892 4727 flags.go:64] FLAG: --memory-manager-policy="None" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141898 4727 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141904 4727 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141910 4727 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141915 4727 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141928 4727 flags.go:64] FLAG: --node-status-max-images="50" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141933 4727 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141940 4727 flags.go:64] FLAG: --oom-score-adj="-999" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141945 4727 flags.go:64] FLAG: --pod-cidr="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141952 4727 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 01 12:37:02 crc 
kubenswrapper[4727]: I1001 12:37:02.141961 4727 flags.go:64] FLAG: --pod-manifest-path="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141966 4727 flags.go:64] FLAG: --pod-max-pids="-1" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141970 4727 flags.go:64] FLAG: --pods-per-core="0" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141976 4727 flags.go:64] FLAG: --port="10250" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141980 4727 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141986 4727 flags.go:64] FLAG: --provider-id="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.141990 4727 flags.go:64] FLAG: --qos-reserved="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142014 4727 flags.go:64] FLAG: --read-only-port="10255" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142021 4727 flags.go:64] FLAG: --register-node="true" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142026 4727 flags.go:64] FLAG: --register-schedulable="true" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142030 4727 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142040 4727 flags.go:64] FLAG: --registry-burst="10" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142045 4727 flags.go:64] FLAG: --registry-qps="5" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142050 4727 flags.go:64] FLAG: --reserved-cpus="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142055 4727 flags.go:64] FLAG: --reserved-memory="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142061 4727 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142067 4727 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142072 4727 flags.go:64] FLAG: --rotate-certificates="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142078 4727 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142083 4727 flags.go:64] FLAG: --runonce="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142088 4727 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142092 4727 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142097 4727 flags.go:64] FLAG: --seccomp-default="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142103 4727 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142108 4727 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142115 4727 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142120 4727 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142126 4727 flags.go:64] FLAG: --storage-driver-password="root" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142131 4727 flags.go:64] FLAG: --storage-driver-secure="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142135 4727 flags.go:64] FLAG: --storage-driver-table="stats" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142141 4727 flags.go:64] FLAG: 
--storage-driver-user="root" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142146 4727 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142151 4727 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142157 4727 flags.go:64] FLAG: --system-cgroups="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142162 4727 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142170 4727 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142175 4727 flags.go:64] FLAG: --tls-cert-file="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142181 4727 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142187 4727 flags.go:64] FLAG: --tls-min-version="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142210 4727 flags.go:64] FLAG: --tls-private-key-file="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142216 4727 flags.go:64] FLAG: --topology-manager-policy="none" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142220 4727 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142226 4727 flags.go:64] FLAG: --topology-manager-scope="container" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142231 4727 flags.go:64] FLAG: --v="2" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142238 4727 flags.go:64] FLAG: --version="false" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142245 4727 flags.go:64] FLAG: --vmodule="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142251 4727 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142256 4727 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142390 4727 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142399 4727 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142405 4727 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142410 4727 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142416 4727 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142421 4727 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142425 4727 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142430 4727 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142434 4727 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142440 4727 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142445 4727 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 
12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142449 4727 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142456 4727 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142462 4727 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142466 4727 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142471 4727 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142476 4727 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142480 4727 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142484 4727 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142489 4727 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142493 4727 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142499 4727 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142504 4727 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142509 4727 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142514 4727 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142518 4727 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142522 4727 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142527 4727 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142531 4727 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142535 4727 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142540 4727 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142556 4727 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142562 4727 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142567 4727 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142572 4727 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142578 4727 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142583 4727 feature_gate.go:330] unrecognized feature gate: 
DNSNameResolver Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142588 4727 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142593 4727 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142598 4727 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142602 4727 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142607 4727 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142612 4727 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142616 4727 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142622 4727 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142627 4727 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142632 4727 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142637 4727 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142642 4727 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142647 4727 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142652 4727 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142657 4727 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142661 4727 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142666 4727 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142672 4727 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142678 4727 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142683 4727 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142690 4727 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142695 4727 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142701 4727 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142706 4727 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142712 4727 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142717 4727 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142723 4727 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142729 4727 feature_gate.go:330] unrecognized feature gate: Example Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142734 4727 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142739 4727 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142744 4727 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142749 4727 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142754 4727 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.142760 4727 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.142777 4727 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.151253 4727 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.151286 4727 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151347 4727 feature_gate.go:330] unrecognized feature gate: Example Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151355 4727 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151360 4727 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151364 4727 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 12:37:02 crc kubenswrapper[4727]: 
W1001 12:37:02.151368 4727 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151372 4727 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151375 4727 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151379 4727 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151382 4727 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151385 4727 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151389 4727 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151392 4727 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151396 4727 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151399 4727 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151404 4727 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151410 4727 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151414 4727 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151417 4727 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151421 4727 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151425 4727 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151429 4727 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151432 4727 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151436 4727 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151439 4727 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151444 4727 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151449 4727 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151453 4727 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151459 4727 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151465 4727 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151469 4727 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151473 4727 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151477 4727 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151482 4727 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151486 4727 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151490 4727 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151494 4727 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151498 4727 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151501 4727 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151505 4727 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151508 4727 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151512 4727 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151515 4727 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151519 4727 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151522 4727 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151525 4727 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151529 4727 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151533 4727 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151537 4727 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151541 4727 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151544 4727 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151548 4727 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151551 4727 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151555 4727 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151559 4727 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 12:37:02 crc 
kubenswrapper[4727]: W1001 12:37:02.151563 4727 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151566 4727 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151570 4727 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151573 4727 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151577 4727 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151581 4727 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151584 4727 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151588 4727 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151591 4727 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151595 4727 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151599 4727 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151603 4727 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151606 4727 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151610 4727 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151614 4727 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151618 4727 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151623 4727 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.151629 4727 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151749 4727 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151756 4727 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151760 4727 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151764 4727 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151770 4727 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151774 4727 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151778 4727 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151783 4727 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151787 4727 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151797 4727 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151803 4727 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151808 4727 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151814 4727 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151821 4727 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151827 4727 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151832 4727 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151837 4727 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151841 4727 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151846 4727 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151851 4727 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151855 4727 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151860 4727 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151864 4727 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151868 4727 feature_gate.go:330] unrecognized feature gate: Example Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151873 4727 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151877 4727 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151883 4727 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151889 4727 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151895 4727 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151901 4727 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151906 4727 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151911 4727 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151915 4727 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151920 4727 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151926 4727 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151930 4727 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151936 4727 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151940 4727 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151944 4727 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151949 4727 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151953 4727 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151957 4727 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151961 4727 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151965 4727 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151969 4727 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151973 4727 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151977 4727 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151982 4727 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151986 4727 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.151991 4727 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152012 4727 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152017 4727 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152021 4727 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152025 4727 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152030 4727 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 
12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152033 4727 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152039 4727 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152043 4727 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152048 4727 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152052 4727 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152056 4727 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152061 4727 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152065 4727 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152069 4727 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152073 4727 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152077 4727 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152083 4727 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152088 4727 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152093 4727 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152098 4727 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.152104 4727 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.152111 4727 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.153559 4727 server.go:940] "Client rotation is on, will bootstrap in background" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.160109 4727 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.160215 4727 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
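The repeated "unrecognized feature gate" warnings and the much smaller effective map logged above reflect a simple filter: gate names unknown to this kubelet build are warned about and dropped, and only known gates reach the effective map. A minimal sketch of that behavior follows (illustrative only, not the kubelet's actual feature_gate.go; the gate names are examples taken from the warnings and map above).

package main

import "log"

func main() {
	// Gates this hypothetical build recognizes (small subset for illustration).
	known := map[string]bool{
		"CloudDualStackNodeIPs":                  true,
		"DisableKubeletCloudCredentialProviders": true,
		"KMSv1":                                  true,
		"ValidatingAdmissionPolicy":              true,
	}
	// Gates requested via the rendered kubelet config; platform-specific names
	// such as GatewayAPI are unknown to the upstream kubelet.
	requested := map[string]bool{
		"CloudDualStackNodeIPs": true,
		"KMSv1":                 true,
		"GatewayAPI":            true,
	}
	effective := map[string]bool{}
	for name, enabled := range requested {
		if !known[name] {
			// Mirrors the W-level "unrecognized feature gate" entries above.
			log.Printf("unrecognized feature gate: %s", name)
			continue
		}
		effective[name] = enabled
	}
	// Mirrors the I-level "feature gates: {map[...]}" summary entries above.
	log.Printf("feature gates: %v", effective)
}
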
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.162316 4727 server.go:997] "Starting client certificate rotation" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.162353 4727 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.164698 4727 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-02 05:46:18.241248662 +0000 UTC Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.164826 4727 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1481h9m16.076425564s for next certificate rotation Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.191213 4727 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.196020 4727 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.216838 4727 log.go:25] "Validated CRI v1 runtime API" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.246054 4727 log.go:25] "Validated CRI v1 image API" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.248036 4727 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.255671 4727 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-01-12-32-27-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.255713 4727 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.272407 4727 manager.go:217] Machine: {Timestamp:2025-10-01 12:37:02.269463098 +0000 UTC m=+0.590817955 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799886 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:08ba6cbf-28d5-4f2d-86d9-787fd74364b2 BootID:5b442e64-06eb-4ef0-99a3-e242f42c1322 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 
Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:25:34:1c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:25:34:1c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b7:58:c7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ea:ad:85 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:7e:4e:74 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ae:3c:6a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:86:ac:b0:ab:bd:1a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:76:e4:9e:35:b6:e6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: 
DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.272651 4727 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.272805 4727 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.274061 4727 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.274280 4727 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.274316 4727 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.275284 4727 topology_manager.go:138] "Creating topology manager with none policy" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.275309 4727 container_manager_linux.go:303] "Creating device plugin manager" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.275958 4727 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.275993 4727 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 01 
12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.276232 4727 state_mem.go:36] "Initialized new in-memory state store" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.276362 4727 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.280848 4727 kubelet.go:418] "Attempting to sync node with API server" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.280878 4727 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.280904 4727 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.280920 4727 kubelet.go:324] "Adding apiserver pod source" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.280944 4727 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.286283 4727 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.286971 4727 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.289433 4727 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.292873 4727 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.292951 4727 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.293018 4727 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.293097 4727 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.293815 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.294540 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.294577 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.294592 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 
12:37:02.294615 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.294632 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.294647 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.294670 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.294686 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.294700 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.294743 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.294759 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.296803 4727 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.297677 4727 server.go:1280] "Started kubelet" Oct 01 12:37:02 crc systemd[1]: Started Kubernetes Kubelet. Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.301498 4727 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.302104 4727 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.302270 4727 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.302618 4727 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.302669 4727 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.302726 4727 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.302847 4727 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 12:27:41.874773049 +0000 UTC Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.302876 4727 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2615h50m39.571898243s for next certificate rotation Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.303061 4727 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.303073 4727 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.303161 4727 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.303549 4727 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.304536 4727 
factory.go:55] Registering systemd factory Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.304585 4727 factory.go:221] Registration of the systemd container factory successfully Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.304956 4727 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.305041 4727 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.305397 4727 factory.go:153] Registering CRI-O factory Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.305443 4727 factory.go:221] Registration of the crio container factory successfully Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.305571 4727 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.305617 4727 factory.go:103] Registering Raw factory Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.305646 4727 manager.go:1196] Started watching for new ooms in manager Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.306490 4727 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="200ms" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.306658 4727 manager.go:319] Starting recovery of all containers Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.310183 4727 server.go:460] "Adding debug handlers to kubelet server" Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.309097 4727 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a5e3d6980209d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-01 12:37:02.297620637 +0000 UTC m=+0.618975514,LastTimestamp:2025-10-01 12:37:02.297620637 +0000 UTC m=+0.618975514,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.315969 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316074 4727 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316337 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316356 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316401 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316417 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316429 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316441 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316455 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316468 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316478 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316488 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316499 4727 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316513 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316524 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316532 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316546 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316555 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316567 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316577 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.316586 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322516 4727 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322617 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 01 12:37:02 crc 
kubenswrapper[4727]: I1001 12:37:02.322653 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322681 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322707 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322731 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322761 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322789 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322814 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322837 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322862 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322885 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322910 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 
12:37:02.322935 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.322959 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323106 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323133 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323159 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323187 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323213 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323240 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323266 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323294 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323322 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323347 4727 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323372 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323398 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323422 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323448 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323473 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323497 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323523 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323558 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323586 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323614 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323644 4727 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323671 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323700 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323727 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323753 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323781 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323808 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323833 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323858 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323886 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323951 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.323979 4727 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324048 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324077 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324103 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324132 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324160 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324187 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324215 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324239 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324268 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324293 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324318 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324345 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324372 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324396 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324421 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324445 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324471 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324494 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324519 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324545 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324569 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324594 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324619 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324643 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324671 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324695 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324718 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324742 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324767 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324790 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324815 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324843 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324867 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324892 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324916 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324948 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.324973 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325044 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325075 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325103 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325134 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325165 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325193 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325221 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325251 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325278 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325305 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325332 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325356 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325382 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325406 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325429 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325452 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325478 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325504 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325526 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325555 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325580 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325604 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325628 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325652 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325675 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325701 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325727 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325750 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325774 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325799 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325831 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325857 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325881 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325942 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.325966 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328225 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328310 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328355 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328409 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328433 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328457 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328502 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328544 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328638 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328692 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328739 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328785 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328813 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328890 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.328974 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.329097 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.329169 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.329220 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.329271 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.329290 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.329316 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.329364 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.329452 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.329498 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.329543 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.329590 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.329645 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330278 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330332 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330361 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330420 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330449 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330480 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330510 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330536 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330575 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330604 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330630 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330659 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330687 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330712 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330738 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330762 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330786 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330816 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330842 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330870 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330896 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330924 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330949 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.330973 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331034 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331065 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331091 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331116 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331143 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331169 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331193 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331221 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331243 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331262 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331281 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331300 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331320 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331346 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331363 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331382 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331401 4727 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331424 4727 reconstruct.go:97] "Volume reconstruction finished" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.331441 4727 reconciler.go:26] "Reconciler: start to sync state" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.339636 4727 manager.go:324] Recovery completed Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.354509 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.356387 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.356428 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 
12:37:02.356442 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.357599 4727 cpu_manager.go:225] "Starting CPU manager" policy="none"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.357790 4727 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.357974 4727 state_mem.go:36] "Initialized new in-memory state store"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.368785 4727 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.370987 4727 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.371064 4727 status_manager.go:217] "Starting to sync pod status with apiserver"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.371109 4727 kubelet.go:2335] "Starting kubelet main sync loop"
Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.371175 4727 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.376511 4727 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused
Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.376580 4727 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.386471 4727 policy_none.go:49] "None policy: Start"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.388230 4727 memory_manager.go:170] "Starting memorymanager" policy="None"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.388302 4727 state_mem.go:35] "Initializing new in-memory state store"
Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.403814 4727 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.441979 4727 manager.go:334] "Starting Device Plugin manager"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.442097 4727 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.442119 4727 server.go:79] "Starting device plugin registration server"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.442754 4727 eviction_manager.go:189] "Eviction manager: starting control loop"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.442792 4727 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.443302 4727 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.443450 4727 plugin_manager.go:116] "The desired_state_of_world 
populator (plugin watcher) starts"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.443464 4727 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.457277 4727 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.471923 4727 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.472034 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.473542 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.473605 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.473623 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.473838 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.474325 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.474362 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.475162 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.475246 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.475434 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.475459 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.475389 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.475560 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.475762 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.476041 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.476090 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.477202 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.477234 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.477250 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.477403 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.478283 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.478315 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.478325 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.478773 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.478962 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.479515 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.479557 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.479574 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.479703 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.479841 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.480026 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.483245 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.483286 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.483304 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.484827 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.484879 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.484892 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.485162 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.485221 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.485240 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.485183 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.485695 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.488023 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.488054 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.488065 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.508335 4727 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="400ms" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533031 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533068 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533101 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533142 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533175 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533191 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533205 4727 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533224 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533258 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533290 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533312 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533351 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533366 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533381 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.533413 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.543315 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 
12:37:02.544418 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.544451 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.544459 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.544481 4727 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.544988 4727 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.634907 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635345 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635377 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635387 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635149 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635439 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635394 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635423 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" 
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635495 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635529 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635565 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635615 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635657 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635691 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635730 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635765 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635728 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635808 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635851 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635856 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635873 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635918 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635939 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.635973 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.636010 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.636068 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.636078 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.636095 4727 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.636123 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.636103 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.745796 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.747270 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.747312 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.747325 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.747348 4727 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.747857 4727 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.832891 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.841395 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.856583 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.874265 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: I1001 12:37:02.880576 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 12:37:02 crc kubenswrapper[4727]: E1001 12:37:02.909199 4727 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="800ms" Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.921613 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-6f88aefd226bf67a5c018b8c5aeb9a094ed66951e9746a1349c7f2d0bd0ebe2b WatchSource:0}: Error finding container 6f88aefd226bf67a5c018b8c5aeb9a094ed66951e9746a1349c7f2d0bd0ebe2b: Status 404 returned error can't find the container with id 6f88aefd226bf67a5c018b8c5aeb9a094ed66951e9746a1349c7f2d0bd0ebe2b Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.928977 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-88dd94978c3388f7505a83b8a765b2ac343b1be1a78232572333cd401a982af5 WatchSource:0}: Error finding container 88dd94978c3388f7505a83b8a765b2ac343b1be1a78232572333cd401a982af5: Status 404 returned error can't find the container with id 88dd94978c3388f7505a83b8a765b2ac343b1be1a78232572333cd401a982af5 Oct 01 12:37:02 crc kubenswrapper[4727]: W1001 12:37:02.941569 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-555c29ba451cc498f97db4de743950e81321587f5de80900be9ba7a3cbdce4cc WatchSource:0}: Error finding container 555c29ba451cc498f97db4de743950e81321587f5de80900be9ba7a3cbdce4cc: Status 404 returned error can't find the container with id 555c29ba451cc498f97db4de743950e81321587f5de80900be9ba7a3cbdce4cc Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.148192 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.149631 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.149720 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.149820 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.149900 4727 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:37:03 crc kubenswrapper[4727]: E1001 12:37:03.151066 4727 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.303130 4727 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:03 crc kubenswrapper[4727]: W1001 12:37:03.332818 4727 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:03 crc kubenswrapper[4727]: E1001 12:37:03.332908 4727 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:03 crc kubenswrapper[4727]: W1001 12:37:03.356220 4727 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:03 crc kubenswrapper[4727]: E1001 12:37:03.356330 4727 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.375597 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4d74df79ffd850ed6765f317e0622c535e769c8e7ed987050eda488b7f6d0d34"} Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.377552 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6f88aefd226bf67a5c018b8c5aeb9a094ed66951e9746a1349c7f2d0bd0ebe2b"} Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.380919 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"caeda5b53f96470660c7c0d966d91ecad6d5c59d8835121068fea5d652564fd7"} Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.382523 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"555c29ba451cc498f97db4de743950e81321587f5de80900be9ba7a3cbdce4cc"} Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.384111 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"88dd94978c3388f7505a83b8a765b2ac343b1be1a78232572333cd401a982af5"} Oct 01 12:37:03 crc kubenswrapper[4727]: W1001 12:37:03.393607 4727 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:03 crc kubenswrapper[4727]: E1001 12:37:03.393679 4727 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 
38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:03 crc kubenswrapper[4727]: W1001 12:37:03.528523 4727 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:03 crc kubenswrapper[4727]: E1001 12:37:03.528617 4727 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:03 crc kubenswrapper[4727]: E1001 12:37:03.710361 4727 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="1.6s" Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.966925 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.968350 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.968396 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.968407 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:03 crc kubenswrapper[4727]: I1001 12:37:03.968432 4727 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:37:03 crc kubenswrapper[4727]: E1001 12:37:03.968709 4727 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.303331 4727 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.390424 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6"} Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.393098 4727 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ee5e6294d2eede6127e8a96a2918086c184a155b5f5495ba2e3dfa517bf6ea2c" exitCode=0 Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.393156 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ee5e6294d2eede6127e8a96a2918086c184a155b5f5495ba2e3dfa517bf6ea2c"} Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.393230 4727 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.395058 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.395105 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.395120 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.396079 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc"} Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.396310 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.397759 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.397821 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.397845 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.398503 4727 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c" exitCode=0 Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.398625 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.398697 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c"} Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.399679 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.399717 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.399735 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.400969 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318"} Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.401173 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.402753 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.402797 4727 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:04 crc kubenswrapper[4727]: I1001 12:37:04.402809 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.302801 4727 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:05 crc kubenswrapper[4727]: E1001 12:37:05.312329 4727 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="3.2s" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.410339 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"40a5eee022677df9faef1fa90bae6dd0987ead513c125425b2aab5c5e635e47e"} Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.410401 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0ac7ffe118814edb3f763dce5c8d5adee0faab3a74f38abb06f39d0ffb91dea2"} Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.410412 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f792b3289e210881d451962f8c2fd7f66ba8e01540309210e4286af5c14056c8"} Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.410475 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.412012 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.412059 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.412074 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.413737 4727 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318" exitCode=0 Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.413843 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318"} Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.413869 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.415158 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.415234 4727 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.415248 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.420308 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512"} Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.420357 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056"} Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.420377 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e"} Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.420405 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.421492 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.421532 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.421547 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.422618 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.423293 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8e6befe9632634578dd92528ccd7d40d3b512da621b0ae607e372340ac66c740"} Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.423792 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.423828 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.423843 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.424961 4727 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc" exitCode=0 Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.425021 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc"} Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 
12:37:05.425377 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.426761 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.426804 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.426817 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.431185 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.432154 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.432196 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.432209 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.569687 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.571273 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.571321 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.571331 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:05 crc kubenswrapper[4727]: I1001 12:37:05.571371 4727 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:37:05 crc kubenswrapper[4727]: E1001 12:37:05.572541 4727 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Oct 01 12:37:05 crc kubenswrapper[4727]: W1001 12:37:05.622132 4727 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:05 crc kubenswrapper[4727]: E1001 12:37:05.622253 4727 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:05 crc kubenswrapper[4727]: W1001 12:37:05.728733 4727 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:05 crc kubenswrapper[4727]: E1001 12:37:05.728840 4727 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:06 crc kubenswrapper[4727]: W1001 12:37:06.128567 4727 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:06 crc kubenswrapper[4727]: E1001 12:37:06.128679 4727 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:06 crc kubenswrapper[4727]: W1001 12:37:06.173908 4727 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:06 crc kubenswrapper[4727]: E1001 12:37:06.174028 4727 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.302910 4727 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.431260 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430"} Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.431316 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23"} Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.431330 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f"} Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.431341 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8"} Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.436963 4727 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f" exitCode=0 Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.437136 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.437164 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f"} Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.437286 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.437318 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.437352 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.437318 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.438715 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.438745 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.438750 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.438763 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.438775 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.438790 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.438755 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.438790 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.438967 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.439565 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.439654 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:06 crc kubenswrapper[4727]: I1001 12:37:06.439667 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.442908 4727 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d" exitCode=0 Oct 
01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.442988 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d"} Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.443120 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.444547 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.444577 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.444588 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.448972 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f"} Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.449113 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.449219 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.450143 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.450204 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.450228 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.452177 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.452222 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.452238 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.640336 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.640535 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.641875 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.641919 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:07 crc kubenswrapper[4727]: I1001 12:37:07.641933 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:08 crc 
kubenswrapper[4727]: I1001 12:37:08.456745 4727 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.456811 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.457156 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083"} Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.457197 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7"} Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.457209 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8"} Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.457219 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66"} Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.459735 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.459757 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.459766 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.773260 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.774817 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.774859 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.774871 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:08 crc kubenswrapper[4727]: I1001 12:37:08.774894 4727 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:37:09 crc kubenswrapper[4727]: I1001 12:37:09.187864 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:09 crc kubenswrapper[4727]: I1001 12:37:09.464698 4727 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:37:09 crc kubenswrapper[4727]: I1001 12:37:09.464752 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:09 crc kubenswrapper[4727]: I1001 12:37:09.464738 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16"} Oct 01 12:37:09 crc kubenswrapper[4727]: I1001 12:37:09.464763 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:09 crc kubenswrapper[4727]: I1001 12:37:09.465844 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:09 crc kubenswrapper[4727]: I1001 12:37:09.465900 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:09 crc kubenswrapper[4727]: I1001 12:37:09.465917 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:09 crc kubenswrapper[4727]: I1001 12:37:09.466451 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:09 crc kubenswrapper[4727]: I1001 12:37:09.466513 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:09 crc kubenswrapper[4727]: I1001 12:37:09.466535 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:10 crc kubenswrapper[4727]: I1001 12:37:10.467465 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:10 crc kubenswrapper[4727]: I1001 12:37:10.468713 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:10 crc kubenswrapper[4727]: I1001 12:37:10.468789 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:10 crc kubenswrapper[4727]: I1001 12:37:10.468812 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:10 crc kubenswrapper[4727]: I1001 12:37:10.506968 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:10 crc kubenswrapper[4727]: I1001 12:37:10.507267 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:10 crc kubenswrapper[4727]: I1001 12:37:10.508912 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:10 crc kubenswrapper[4727]: I1001 12:37:10.508965 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:10 crc kubenswrapper[4727]: I1001 12:37:10.508980 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:11 crc kubenswrapper[4727]: I1001 12:37:11.751591 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:11 crc kubenswrapper[4727]: I1001 12:37:11.751899 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:11 crc kubenswrapper[4727]: I1001 12:37:11.753466 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:11 crc kubenswrapper[4727]: I1001 12:37:11.753516 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:11 crc kubenswrapper[4727]: I1001 12:37:11.753538 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:12 crc kubenswrapper[4727]: E1001 12:37:12.457429 4727 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 12:37:12 crc kubenswrapper[4727]: I1001 12:37:12.626148 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 01 12:37:12 crc kubenswrapper[4727]: I1001 12:37:12.626486 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:12 crc kubenswrapper[4727]: I1001 12:37:12.628589 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:12 crc kubenswrapper[4727]: I1001 12:37:12.628646 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:12 crc kubenswrapper[4727]: I1001 12:37:12.628664 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:13 crc kubenswrapper[4727]: I1001 12:37:13.967320 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:13 crc kubenswrapper[4727]: I1001 12:37:13.967582 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:13 crc kubenswrapper[4727]: I1001 12:37:13.969321 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:13 crc kubenswrapper[4727]: I1001 12:37:13.969390 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:13 crc kubenswrapper[4727]: I1001 12:37:13.969407 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:14 crc kubenswrapper[4727]: I1001 12:37:14.308560 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:14 crc kubenswrapper[4727]: I1001 12:37:14.480345 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:14 crc kubenswrapper[4727]: I1001 12:37:14.482156 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:14 crc kubenswrapper[4727]: I1001 12:37:14.482207 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:14 crc kubenswrapper[4727]: I1001 12:37:14.482219 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:14 crc kubenswrapper[4727]: I1001 12:37:14.891404 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 01 12:37:14 crc kubenswrapper[4727]: I1001 12:37:14.891767 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:14 crc kubenswrapper[4727]: I1001 12:37:14.893727 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:14 crc 
kubenswrapper[4727]: I1001 12:37:14.893781 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:14 crc kubenswrapper[4727]: I1001 12:37:14.893793 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:14 crc kubenswrapper[4727]: I1001 12:37:14.903802 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:14 crc kubenswrapper[4727]: I1001 12:37:14.909897 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:15 crc kubenswrapper[4727]: I1001 12:37:15.484522 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:15 crc kubenswrapper[4727]: I1001 12:37:15.486191 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:15 crc kubenswrapper[4727]: I1001 12:37:15.486288 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:15 crc kubenswrapper[4727]: I1001 12:37:15.486307 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:15 crc kubenswrapper[4727]: I1001 12:37:15.489684 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:16 crc kubenswrapper[4727]: I1001 12:37:16.487123 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:16 crc kubenswrapper[4727]: I1001 12:37:16.488584 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:16 crc kubenswrapper[4727]: I1001 12:37:16.488622 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:16 crc kubenswrapper[4727]: I1001 12:37:16.488636 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:16 crc kubenswrapper[4727]: I1001 12:37:16.968106 4727 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 12:37:16 crc kubenswrapper[4727]: I1001 12:37:16.968220 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 12:37:17 crc kubenswrapper[4727]: I1001 12:37:17.304317 4727 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 01 12:37:17 crc kubenswrapper[4727]: I1001 12:37:17.489154 4727 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Oct 01 12:37:17 crc kubenswrapper[4727]: I1001 12:37:17.490218 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:17 crc kubenswrapper[4727]: I1001 12:37:17.490281 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:17 crc kubenswrapper[4727]: I1001 12:37:17.490295 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:18 crc kubenswrapper[4727]: I1001 12:37:18.147618 4727 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 12:37:18 crc kubenswrapper[4727]: I1001 12:37:18.147698 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 12:37:18 crc kubenswrapper[4727]: I1001 12:37:18.152023 4727 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 12:37:18 crc kubenswrapper[4727]: I1001 12:37:18.152056 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 12:37:19 crc kubenswrapper[4727]: I1001 12:37:19.201471 4727 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]log ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]etcd ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/generic-apiserver-start-informers ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/priority-and-fairness-filter ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 01 12:37:19 crc kubenswrapper[4727]: 
[+]poststarthook/start-apiextensions-informers ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/start-apiextensions-controllers ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/crd-informer-synced ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/start-system-namespaces-controller ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 01 12:37:19 crc kubenswrapper[4727]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/bootstrap-controller ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/start-kube-aggregator-informers ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/apiservice-registration-controller ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/apiservice-discovery-controller ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]autoregister-completion ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/apiservice-openapi-controller ok Oct 01 12:37:19 crc kubenswrapper[4727]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 01 12:37:19 crc kubenswrapper[4727]: livez check failed Oct 01 12:37:19 crc kubenswrapper[4727]: I1001 12:37:19.201560 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:37:22 crc kubenswrapper[4727]: E1001 12:37:22.458533 4727 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.144149 4727 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.147704 4727 trace.go:236] Trace[1204414834]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 12:37:11.000) (total time: 12147ms): Oct 01 12:37:23 crc kubenswrapper[4727]: Trace[1204414834]: ---"Objects listed" error: 12147ms (12:37:23.147) Oct 01 12:37:23 crc kubenswrapper[4727]: Trace[1204414834]: [12.147086456s] 
[12.147086456s] END Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.147748 4727 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.149317 4727 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.149603 4727 trace.go:236] Trace[1336009289]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 12:37:10.413) (total time: 12735ms): Oct 01 12:37:23 crc kubenswrapper[4727]: Trace[1336009289]: ---"Objects listed" error: 12735ms (12:37:23.149) Oct 01 12:37:23 crc kubenswrapper[4727]: Trace[1336009289]: [12.735396115s] [12.735396115s] END Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.149615 4727 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.149620 4727 trace.go:236] Trace[1483462613]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 12:37:10.772) (total time: 12376ms): Oct 01 12:37:23 crc kubenswrapper[4727]: Trace[1483462613]: ---"Objects listed" error: 12376ms (12:37:23.149) Oct 01 12:37:23 crc kubenswrapper[4727]: Trace[1483462613]: [12.376518723s] [12.376518723s] END Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.149650 4727 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.149760 4727 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.150384 4727 trace.go:236] Trace[271683457]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 12:37:10.771) (total time: 12379ms): Oct 01 12:37:23 crc kubenswrapper[4727]: Trace[271683457]: ---"Objects listed" error: 12379ms (12:37:23.150) Oct 01 12:37:23 crc kubenswrapper[4727]: Trace[271683457]: [12.379208982s] [12.379208982s] END Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.150409 4727 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.185446 4727 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40970->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.185537 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40970->192.168.126.11:17697: read: connection reset by peer" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.185446 4727 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40964->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 
01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.185605 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40964->192.168.126.11:17697: read: connection reset by peer" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.294024 4727 apiserver.go:52] "Watching apiserver" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.297372 4727 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.297567 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.297937 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.298014 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.298033 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.298090 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.298133 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.298182 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.298254 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.298336 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.298367 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.300344 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.300358 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.300565 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.301683 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.301810 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.302076 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.302235 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.302391 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.303640 4727 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.307578 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.324867 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.346516 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350614 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350671 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350698 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350721 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 
12:37:23.350747 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350770 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350791 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350812 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350837 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350859 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350883 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350904 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350925 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350947 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 12:37:23 crc 
kubenswrapper[4727]: I1001 12:37:23.350971 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.350992 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351049 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351108 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351135 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351160 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351184 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351207 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351232 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351268 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351292 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351316 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351340 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351380 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351403 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351424 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351514 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351581 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351604 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351627 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351653 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351677 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351702 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351723 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351743 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351764 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351784 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351804 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351827 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351847 4727 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351866 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351889 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351910 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351939 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351962 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351982 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352030 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352053 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351043 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). 
InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352077 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351048 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351261 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351284 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352488 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352504 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351436 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351688 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351769 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351855 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.351949 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352038 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352044 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352545 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352734 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352764 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352768 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352769 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352803 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352899 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352943 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352967 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.352989 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.353206 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.353231 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.353348 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.353421 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.353441 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.353585 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.353644 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.353838 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.353872 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.353985 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.354133 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.354160 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.354280 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.355855 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.355870 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.356860 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.356868 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.357048 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.357170 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.357297 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:37:23.857267846 +0000 UTC m=+22.178622683 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.357637 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.357732 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.357881 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.357967 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358081 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358223 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358320 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358665 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358769 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358855 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358952 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359084 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359213 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359297 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359445 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359575 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359662 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359751 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359872 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359957 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360448 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360710 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360749 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360783 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360812 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360846 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360878 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360913 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.361239 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.361286 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.361445 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.361656 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.357623 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.357702 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.357912 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.357936 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358337 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358391 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358403 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358672 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358731 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358948 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.358961 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359123 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359317 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359339 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359450 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359628 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359695 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359690 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.359880 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). 
InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360082 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360199 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360211 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360375 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360679 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360682 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360831 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360884 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.361908 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.360895 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.361132 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.361214 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.361392 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.361667 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.361687 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.362159 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.362202 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.362245 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.362281 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.356244 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.362406 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.362427 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.362630 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.362659 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.362764 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.362976 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.363020 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.363033 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.363388 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.363405 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.363409 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.363757 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.363805 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.363841 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.363877 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.363910 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.363940 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364111 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364138 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364247 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364147 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364361 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364500 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364645 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364728 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364727 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364784 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364849 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364914 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.364965 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365053 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365092 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365110 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365151 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365211 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365236 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365243 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365263 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365286 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365331 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365377 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365425 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365465 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365515 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365558 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365601 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365625 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365642 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365678 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365704 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365737 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365755 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365794 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365840 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365932 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.366141 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.367019 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.366336 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.366444 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.366742 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.367397 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.367689 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.367791 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.369422 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.365879 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.369716 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.369756 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.369780 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.369804 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.369896 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.369922 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.370401 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.370977 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.371100 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.372942 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373419 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373682 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373700 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373738 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373761 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373786 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373807 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373828 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373897 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373913 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373931 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373956 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.373974 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374007 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374025 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374044 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374063 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 
12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374082 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374100 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374120 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374138 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374159 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374177 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374195 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374214 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374232 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374250 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374270 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374290 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374309 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374333 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374356 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374375 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374392 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374412 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374431 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374446 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374462 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374479 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374496 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374516 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374534 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374554 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374586 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374602 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374619 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374636 4727 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374653 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374672 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374688 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374706 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374726 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374743 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374760 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374777 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374795 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 
12:37:23.374818 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374837 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374861 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374884 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374909 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374934 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375021 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375064 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375123 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375148 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375176 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375205 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375236 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375264 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375292 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375328 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375357 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375385 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375415 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375439 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375548 4727 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375561 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375573 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375586 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375599 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375609 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375620 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375629 4727 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375638 4727 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375647 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375658 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375670 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375681 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375691 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375700 4727 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375709 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375720 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375730 4727 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375739 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375750 4727 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375761 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375770 4727 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375779 4727 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375789 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375800 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375809 4727 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375820 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375829 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375840 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375850 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375860 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375869 4727 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375878 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375887 4727 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375897 4727 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375907 4727 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375916 4727 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375926 4727 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375936 4727 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375946 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375955 4727 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375965 4727 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375974 4727 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375983 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375993 4727 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376018 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376028 4727 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376038 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376048 4727 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376059 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376068 4727 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376076 4727 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376086 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376095 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376105 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376113 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376123 4727 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376132 4727 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376140 4727 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376149 4727 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376159 4727 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376168 4727 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376178 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376187 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376196 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376205 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376215 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376225 4727 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376235 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376244 4727 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376253 4727 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376262 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376272 4727 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376281 4727 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376290 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376299 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376309 4727 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376318 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376327 4727 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376336 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376346 4727 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376355 4727 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376365 4727 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376375 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376386 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376397 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376417 4727 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376431 4727 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376443 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376455 4727 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376465 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376474 4727 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376483 4727 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376494 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376510 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376522 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376533 4727 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376544 4727 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376559 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376573 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376586 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376598 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376610 4727 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376630 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376643 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376655 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376666 4727 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376677 4727 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374039 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374230 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374288 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374499 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374592 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374734 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.374951 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375048 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375180 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375772 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.375954 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376270 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376338 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.376706 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.377762 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.377897 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.378322 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.378172 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.378687 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.378761 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.378867 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.379023 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.379779 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.380295 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.380642 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.380669 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.380747 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.380871 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.380952 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.381124 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.381248 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.381284 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.381558 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.381641 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.382195 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.382232 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.382327 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.382627 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.382757 4727 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.382910 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.383203 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.383223 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.383783 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.383954 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.383979 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.383991 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.384363 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.383261 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.384304 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.384821 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.384833 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.385197 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.385489 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.385717 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.385784 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.386089 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.386430 4727 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.386530 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:23.886502997 +0000 UTC m=+22.207858014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.386550 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.386928 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.387625 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.387637 4727 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.387700 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:23.887684672 +0000 UTC m=+22.209039509 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.387749 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.387775 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.387960 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.388066 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.388304 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.392056 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.394214 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.399300 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.400361 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.400387 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.401709 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.401756 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.401772 4727 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.401872 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:23.901842749 +0000 UTC m=+22.223197586 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.402474 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.403037 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.403174 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.403061 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.403520 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.403895 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.403927 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.403948 4727 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.404037 4727 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:23.904014714 +0000 UTC m=+22.225369561 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.404546 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.404607 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.405890 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.405930 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.407879 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.408688 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.409429 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.409427 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.410371 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.410574 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.411097 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.415737 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.415758 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.417537 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.418096 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.418417 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.418438 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.418982 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.419861 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.421474 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.421481 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.426516 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.428992 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.437541 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.461312 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.477769 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.477848 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.477937 4727 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.477949 4727 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.477959 4727 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.477969 4727 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.477978 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.477988 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.478013 4727 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.478022 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.478031 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.478041 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.478051 4727 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.478059 4727 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.478070 4727 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.478108 4727 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.478745 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.478121 4727 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479240 4727 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479223 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479253 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: 
I1001 12:37:23.479346 4727 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479363 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479377 4727 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479390 4727 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479404 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479424 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479436 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479449 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479461 4727 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479477 4727 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479493 4727 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479506 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479520 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479535 4727 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479551 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479566 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479579 4727 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479592 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479607 4727 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479622 4727 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479635 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479648 4727 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479660 4727 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479671 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479682 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479695 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479706 4727 reconciler_common.go:293] "Volume detached 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479718 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479730 4727 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479742 4727 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479761 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479775 4727 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479788 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479801 4727 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479814 4727 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479827 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479841 4727 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479854 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479867 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc 
kubenswrapper[4727]: I1001 12:37:23.479880 4727 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479897 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479909 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479922 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479934 4727 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479947 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479960 4727 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479972 4727 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.479985 4727 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480013 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480027 4727 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480040 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480051 4727 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480064 
4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480075 4727 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480087 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480100 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480114 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480125 4727 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480137 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480149 4727 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480163 4727 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480178 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480192 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480203 4727 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480215 4727 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480226 4727 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480239 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480252 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.480264 4727 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.498851 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.506519 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.509148 4727 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f" exitCode=255 Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.509267 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f"} Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.524411 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.526273 4727 scope.go:117] "RemoveContainer" containerID="9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.528277 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.539928 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.553015 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.567235 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.580172 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.589382 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.606519 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-0
1T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.616408 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.624044 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.630155 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.632786 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 12:37:23 crc kubenswrapper[4727]: W1001 12:37:23.637502 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-7f92fb8c4936243d553873cfda341012d7bc0980fbc8ff0da458f6d54848683e WatchSource:0}: Error finding container 7f92fb8c4936243d553873cfda341012d7bc0980fbc8ff0da458f6d54848683e: Status 404 returned error can't find the container with id 7f92fb8c4936243d553873cfda341012d7bc0980fbc8ff0da458f6d54848683e Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.643567 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: W1001 12:37:23.652249 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-0913c6d4d9b2b39270cfc09dde571c475c3bee859abacaac36b7a4cab808972b WatchSource:0}: Error finding container 0913c6d4d9b2b39270cfc09dde571c475c3bee859abacaac36b7a4cab808972b: Status 404 returned error can't find the container with id 0913c6d4d9b2b39270cfc09dde571c475c3bee859abacaac36b7a4cab808972b Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.657236 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.882252 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.882537 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:37:24.882475813 +0000 UTC m=+23.203830660 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.973193 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.977527 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.982975 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.983029 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.983056 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:23 crc kubenswrapper[4727]: I1001 12:37:23.983085 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.983144 4727 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.983169 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.983185 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.983223 4727 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.983247 4727 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.983253 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:24.983230849 +0000 UTC m=+23.304585716 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.983325 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:24.983305802 +0000 UTC m=+23.304660649 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.983345 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:24.983334262 +0000 UTC m=+23.304689109 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.983368 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.983416 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.983431 4727 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:23 crc kubenswrapper[4727]: E1001 12:37:23.983520 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:24.983490307 +0000 UTC m=+23.304845144 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:23.999960 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-0
1T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.000568 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.014721 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.028858 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.048596 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.087665 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.118099 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.142501 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.154829 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.170528 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-0
1T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.182145 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.191793 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.192224 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.206089 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.216213 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.229672 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.243235 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.254702 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.270785 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.281215 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.291512 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.303792 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.316608 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.329178 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.340548 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01
T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.372161 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.372615 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.380846 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.381882 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.384784 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.385675 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.387569 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.388307 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.390933 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.391729 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.393120 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.394039 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.394748 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.396244 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.396859 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.398190 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.401071 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.401798 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.402771 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.403936 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.405786 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.409528 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.410292 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.412848 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.413500 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.414982 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.415599 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.416288 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.417968 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.418630 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.422109 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.422747 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.424743 4727 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.424885 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.427093 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.428564 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.429136 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.431730 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.432673 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.433948 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.434734 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.439253 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.439971 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.441352 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.442827 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.443672 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.444921 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.445675 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.446721 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.447800 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.449052 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.449677 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.450332 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.451471 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.452320 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.453449 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.513766 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a"} Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.513890 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7d13c65b09ba0b9c493983f1807db8a8c7a05d5c1b3e1538852930521284e7b0"} Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.515704 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541"} Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.515894 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d"} Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.515979 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0913c6d4d9b2b39270cfc09dde571c475c3bee859abacaac36b7a4cab808972b"} Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.516684 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7f92fb8c4936243d553873cfda341012d7bc0980fbc8ff0da458f6d54848683e"} Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.519440 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.522867 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742"} Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.522919 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.527738 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.528532 4727 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.532658 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.545764 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.559664 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.571775 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.585232 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.601155 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.624484 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01
T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.641164 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.658186 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.690650 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.714662 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.720858 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fjlgl"] Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.721174 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-fjlgl" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.722588 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.725476 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.725791 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.728017 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.739965 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.755935 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.768583 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.780916 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.795019 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.808729 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.822058 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.832201 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.843424 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.853379 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.860544 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.867816 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc 
kubenswrapper[4727]: I1001 12:37:24.878492 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.890806 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.890875 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/972e1ff9-8a88-471a-b5e6-73f16af6df57-hosts-file\") pod \"node-resolver-fjlgl\" (UID: \"972e1ff9-8a88-471a-b5e6-73f16af6df57\") " pod="openshift-dns/node-resolver-fjlgl" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.890916 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rqctk\" (UniqueName: \"kubernetes.io/projected/972e1ff9-8a88-471a-b5e6-73f16af6df57-kube-api-access-rqctk\") pod \"node-resolver-fjlgl\" (UID: \"972e1ff9-8a88-471a-b5e6-73f16af6df57\") " pod="openshift-dns/node-resolver-fjlgl" Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.891049 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:37:26.891035267 +0000 UTC m=+25.212390104 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.930757 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.943784 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.947604 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:24Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.960739 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:24Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.980433 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:24Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.982907 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.991577 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/972e1ff9-8a88-471a-b5e6-73f16af6df57-hosts-file\") pod \"node-resolver-fjlgl\" (UID: \"972e1ff9-8a88-471a-b5e6-73f16af6df57\") " pod="openshift-dns/node-resolver-fjlgl" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.991630 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqctk\" (UniqueName: \"kubernetes.io/projected/972e1ff9-8a88-471a-b5e6-73f16af6df57-kube-api-access-rqctk\") pod \"node-resolver-fjlgl\" (UID: \"972e1ff9-8a88-471a-b5e6-73f16af6df57\") " pod="openshift-dns/node-resolver-fjlgl" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.991658 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.991681 4727 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.991710 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.991733 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:24 crc kubenswrapper[4727]: I1001 12:37:24.991753 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/972e1ff9-8a88-471a-b5e6-73f16af6df57-hosts-file\") pod \"node-resolver-fjlgl\" (UID: \"972e1ff9-8a88-471a-b5e6-73f16af6df57\") " pod="openshift-dns/node-resolver-fjlgl" Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.991839 4727 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.991895 4727 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.991963 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:26.991922366 +0000 UTC m=+25.313277273 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.991843 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.992046 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:26.992032259 +0000 UTC m=+25.313387146 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.992058 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.992081 4727 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.991848 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.992132 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.992147 4727 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.992155 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:26.992130692 +0000 UTC m=+25.313485559 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:24 crc kubenswrapper[4727]: E1001 12:37:24.992182 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:26.992171254 +0000 UTC m=+25.313526161 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.005113 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:24Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.022420 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqctk\" (UniqueName: \"kubernetes.io/projected/972e1ff9-8a88-471a-b5e6-73f16af6df57-kube-api-access-rqctk\") pod \"node-resolver-fjlgl\" (UID: \"972e1ff9-8a88-471a-b5e6-73f16af6df57\") " pod="openshift-dns/node-resolver-fjlgl" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.027024 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.034584 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-fjlgl" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.044959 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.062482 4727 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.078610 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.097067 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.115041 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.132986 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.146334 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.162964 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.178136 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.195115 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.207419 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.222227 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.243150 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-m
etrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.258939 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.371596 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.371663 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:25 crc kubenswrapper[4727]: E1001 12:37:25.371758 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:25 crc kubenswrapper[4727]: E1001 12:37:25.371845 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.481864 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-c7jw9"] Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.482331 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-slqxs"] Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.482545 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.482626 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.485910 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.485948 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.486229 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.486549 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.487697 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.487849 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.487989 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.488194 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.488380 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.488888 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.521395 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.526706 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fjlgl" event={"ID":"972e1ff9-8a88-471a-b5e6-73f16af6df57","Type":"ContainerStarted","Data":"f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db"} Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.526807 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fjlgl" event={"ID":"972e1ff9-8a88-471a-b5e6-73f16af6df57","Type":"ContainerStarted","Data":"46b3e6d4f826a81542bf198e5a2019c453296f30de3a67839b64b978f6611b03"} Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.558233 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.589727 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is 
after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598237 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-os-release\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598329 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-multus-cni-dir\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598357 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-cnibin\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598380 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-hostroot\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598408 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-var-lib-kubelet\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598447 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56tnh\" (UniqueName: \"kubernetes.io/projected/d18290ae-64a5-44a5-a704-90977d85852b-kube-api-access-56tnh\") pod \"machine-config-daemon-c7jw9\" (UID: \"d18290ae-64a5-44a5-a704-90977d85852b\") " pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598477 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-etc-kubernetes\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598639 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d18290ae-64a5-44a5-a704-90977d85852b-proxy-tls\") pod \"machine-config-daemon-c7jw9\" (UID: \"d18290ae-64a5-44a5-a704-90977d85852b\") " pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598703 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-multus-conf-dir\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 
12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598736 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5cf1a0b8-9119-44c6-91ea-473317335fb9-cni-binary-copy\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598753 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-var-lib-cni-bin\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598774 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5cf1a0b8-9119-44c6-91ea-473317335fb9-multus-daemon-config\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598835 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d18290ae-64a5-44a5-a704-90977d85852b-mcd-auth-proxy-config\") pod \"machine-config-daemon-c7jw9\" (UID: \"d18290ae-64a5-44a5-a704-90977d85852b\") " pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598863 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-multus-socket-dir-parent\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598882 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-run-multus-certs\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598906 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d18290ae-64a5-44a5-a704-90977d85852b-rootfs\") pod \"machine-config-daemon-c7jw9\" (UID: \"d18290ae-64a5-44a5-a704-90977d85852b\") " pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598924 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-run-k8s-cni-cncf-io\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598941 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8jc\" (UniqueName: \"kubernetes.io/projected/5cf1a0b8-9119-44c6-91ea-473317335fb9-kube-api-access-dc8jc\") pod \"multus-slqxs\" (UID: 
\"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.598968 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-system-cni-dir\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.599007 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-run-netns\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.599028 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-var-lib-cni-multus\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.619657 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.653802 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.684118 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700499 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5cf1a0b8-9119-44c6-91ea-473317335fb9-cni-binary-copy\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700566 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-var-lib-cni-bin\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700589 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5cf1a0b8-9119-44c6-91ea-473317335fb9-multus-daemon-config\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700616 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d18290ae-64a5-44a5-a704-90977d85852b-mcd-auth-proxy-config\") pod \"machine-config-daemon-c7jw9\" (UID: \"d18290ae-64a5-44a5-a704-90977d85852b\") " pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700643 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-multus-socket-dir-parent\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700664 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-run-multus-certs\") 
pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700701 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d18290ae-64a5-44a5-a704-90977d85852b-rootfs\") pod \"machine-config-daemon-c7jw9\" (UID: \"d18290ae-64a5-44a5-a704-90977d85852b\") " pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700722 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-run-k8s-cni-cncf-io\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700746 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8jc\" (UniqueName: \"kubernetes.io/projected/5cf1a0b8-9119-44c6-91ea-473317335fb9-kube-api-access-dc8jc\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700779 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-system-cni-dir\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700799 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-run-netns\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700823 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-var-lib-cni-multus\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700847 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-os-release\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700884 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-multus-socket-dir-parent\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700918 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-var-lib-cni-bin\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700910 4727 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-multus-cni-dir\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.700984 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-run-netns\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701046 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-var-lib-cni-multus\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701071 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-cnibin\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701191 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-var-lib-kubelet\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701223 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-hostroot\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701227 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-system-cni-dir\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701254 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-cnibin\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701270 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56tnh\" (UniqueName: \"kubernetes.io/projected/d18290ae-64a5-44a5-a704-90977d85852b-kube-api-access-56tnh\") pod \"machine-config-daemon-c7jw9\" (UID: \"d18290ae-64a5-44a5-a704-90977d85852b\") " pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701294 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-etc-kubernetes\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " 
pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701322 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-run-multus-certs\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701337 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-multus-conf-dir\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701359 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d18290ae-64a5-44a5-a704-90977d85852b-proxy-tls\") pod \"machine-config-daemon-c7jw9\" (UID: \"d18290ae-64a5-44a5-a704-90977d85852b\") " pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701369 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d18290ae-64a5-44a5-a704-90977d85852b-rootfs\") pod \"machine-config-daemon-c7jw9\" (UID: \"d18290ae-64a5-44a5-a704-90977d85852b\") " pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701469 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-multus-conf-dir\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701506 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-var-lib-kubelet\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701512 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-hostroot\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701471 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-etc-kubernetes\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701469 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-host-run-k8s-cni-cncf-io\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.701587 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-os-release\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.702352 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cf1a0b8-9119-44c6-91ea-473317335fb9-multus-cni-dir\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.718345 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.727234 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d18290ae-64a5-44a5-a704-90977d85852b-mcd-auth-proxy-config\") pod \"machine-config-daemon-c7jw9\" (UID: \"d18290ae-64a5-44a5-a704-90977d85852b\") " pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.728713 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5cf1a0b8-9119-44c6-91ea-473317335fb9-cni-binary-copy\") pod 
\"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.744424 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc
8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.751124 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5cf1a0b8-9119-44c6-91ea-473317335fb9-multus-daemon-config\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.756248 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56tnh\" (UniqueName: \"kubernetes.io/projected/d18290ae-64a5-44a5-a704-90977d85852b-kube-api-access-56tnh\") pod \"machine-config-daemon-c7jw9\" (UID: \"d18290ae-64a5-44a5-a704-90977d85852b\") " pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.759566 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d18290ae-64a5-44a5-a704-90977d85852b-proxy-tls\") pod \"machine-config-daemon-c7jw9\" (UID: \"d18290ae-64a5-44a5-a704-90977d85852b\") " pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.772830 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc8jc\" (UniqueName: \"kubernetes.io/projected/5cf1a0b8-9119-44c6-91ea-473317335fb9-kube-api-access-dc8jc\") pod \"multus-slqxs\" (UID: \"5cf1a0b8-9119-44c6-91ea-473317335fb9\") " pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.775838 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.797878 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-slqxs" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.806311 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.809449 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: W1001 12:37:25.821070 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd18290ae_64a5_44a5_a704_90977d85852b.slice/crio-c132d77a3852cc1d43fd2a5c5a58754be2d0c2685d868305c589c7cde9ccccb4 WatchSource:0}: Error finding container c132d77a3852cc1d43fd2a5c5a58754be2d0c2685d868305c589c7cde9ccccb4: Status 404 returned error can't find the container with id c132d77a3852cc1d43fd2a5c5a58754be2d0c2685d868305c589c7cde9ccccb4 Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.855595 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.904186 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.929749 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.942543 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pwx55"] Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.943930 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-nfgjl"] Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.944165 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.944969 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.971448 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:25Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.980233 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 01 12:37:25 crc kubenswrapper[4727]: I1001 12:37:25.996232 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.017353 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.036630 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.056734 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 01 12:37:26 crc 
kubenswrapper[4727]: I1001 12:37:26.077616 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.096506 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107311 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txq6l\" (UniqueName: \"kubernetes.io/projected/a908511b-2ce2-4a11-8dad-3867bee13f57-kube-api-access-txq6l\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107360 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-node-log\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107385 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-etc-openvswitch\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107410 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-openvswitch\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107446 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107603 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107712 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-log-socket\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107745 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh2x2\" (UniqueName: \"kubernetes.io/projected/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-kube-api-access-nh2x2\") pod 
\"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107782 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-run-netns\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107802 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-os-release\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107818 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-cni-netd\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107834 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107849 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-ovnkube-config\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107867 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a908511b-2ce2-4a11-8dad-3867bee13f57-ovn-node-metrics-cert\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107884 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-systemd\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107900 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-ovn\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107914 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107929 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-env-overrides\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107954 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-slash\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.107968 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-cni-bin\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.108036 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-system-cni-dir\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.108064 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.108088 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-kubelet\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.108111 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-var-lib-openvswitch\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.108127 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-cnibin\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " 
pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.108144 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-systemd-units\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.108161 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-ovnkube-script-lib\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.117342 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.136779 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.186105 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.208792 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txq6l\" (UniqueName: \"kubernetes.io/projected/a908511b-2ce2-4a11-8dad-3867bee13f57-kube-api-access-txq6l\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.208842 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-node-log\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.208881 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-etc-openvswitch\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.208905 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-openvswitch\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.208929 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.208955 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.208987 4727 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-log-socket\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209023 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh2x2\" (UniqueName: \"kubernetes.io/projected/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-kube-api-access-nh2x2\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209048 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-run-netns\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209063 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-os-release\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209076 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-cni-netd\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209091 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209120 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-ovnkube-config\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209152 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a908511b-2ce2-4a11-8dad-3867bee13f57-ovn-node-metrics-cert\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209177 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-systemd\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209202 4727 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-ovn\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209221 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209231 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-node-log\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209460 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-run-netns\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209715 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-os-release\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209243 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-env-overrides\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209190 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-log-socket\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209800 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-cni-netd\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209855 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209877 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209881 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-systemd\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209904 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-etc-openvswitch\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209924 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-openvswitch\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209937 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-env-overrides\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.209991 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-ovn\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210269 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-slash\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210299 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-cni-bin\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210376 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-cni-bin\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210394 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-slash\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210463 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-system-cni-dir\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210493 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-system-cni-dir\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210506 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-ovnkube-config\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210544 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210613 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-cnibin\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210625 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210651 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-cnibin\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210691 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-kubelet\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210722 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-kubelet\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210746 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-var-lib-openvswitch\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210776 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-var-lib-openvswitch\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210803 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-systemd-units\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210822 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-ovnkube-script-lib\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.210963 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-systemd-units\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.211283 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.211779 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.212060 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-ovnkube-script-lib\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.214759 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a908511b-2ce2-4a11-8dad-3867bee13f57-ovn-node-metrics-cert\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.225845 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.252735 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txq6l\" (UniqueName: \"kubernetes.io/projected/a908511b-2ce2-4a11-8dad-3867bee13f57-kube-api-access-txq6l\") pod \"ovnkube-node-pwx55\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.270836 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh2x2\" (UniqueName: \"kubernetes.io/projected/1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee-kube-api-access-nh2x2\") pod \"multus-additional-cni-plugins-nfgjl\" (UID: \"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\") " pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: 
I1001 12:37:26.279058 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.285519 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" Oct 01 12:37:26 crc kubenswrapper[4727]: W1001 12:37:26.294789 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda908511b_2ce2_4a11_8dad_3867bee13f57.slice/crio-51cd98c693b08619591dd5b354ddc00a92e7e447846a509d65c77f8dbb77dad3 WatchSource:0}: Error finding container 51cd98c693b08619591dd5b354ddc00a92e7e447846a509d65c77f8dbb77dad3: Status 404 returned error can't find the container with id 51cd98c693b08619591dd5b354ddc00a92e7e447846a509d65c77f8dbb77dad3 Oct 01 12:37:26 crc kubenswrapper[4727]: W1001 12:37:26.303848 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1062bbb4_dd72_4659_91b3_2aa9f1b6a1ee.slice/crio-76ce548af315240f51b31c647c9ac45480f8a01ec47deb08f1b4214b20820581 WatchSource:0}: Error finding container 76ce548af315240f51b31c647c9ac45480f8a01ec47deb08f1b4214b20820581: Status 404 returned error can't find the container with id 76ce548af315240f51b31c647c9ac45480f8a01ec47deb08f1b4214b20820581 Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.308047 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.358690 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f
5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.372253 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:26 crc kubenswrapper[4727]: E1001 12:37:26.372395 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.389660 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.431303 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.470925 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.508338 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.532058 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf"} Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.533699 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-nfgjl" event={"ID":"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee","Type":"ContainerStarted","Data":"76ce548af315240f51b31c647c9ac45480f8a01ec47deb08f1b4214b20820581"} Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.535120 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6" exitCode=0 Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.535195 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6"} Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.535224 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"51cd98c693b08619591dd5b354ddc00a92e7e447846a509d65c77f8dbb77dad3"} Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.537272 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead"} Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.537295 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca"} Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.537307 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"c132d77a3852cc1d43fd2a5c5a58754be2d0c2685d868305c589c7cde9ccccb4"} Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.551276 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-slqxs" event={"ID":"5cf1a0b8-9119-44c6-91ea-473317335fb9","Type":"ContainerStarted","Data":"4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9"} Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.551315 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-slqxs" event={"ID":"5cf1a0b8-9119-44c6-91ea-473317335fb9","Type":"ContainerStarted","Data":"72825c60ac6152e0f94d9756e0ae8d6da133bb728c189c2563ed44b7b44011f0"} Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.566883 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.638634 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.662031 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.679305 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.710117 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.747228 4727 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting
\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.794489 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.826109 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.869087 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.906050 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.915820 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:26 crc kubenswrapper[4727]: E1001 12:37:26.915963 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:37:30.915943387 +0000 UTC m=+29.237298234 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.955467 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:37:26 crc kubenswrapper[4727]: I1001 12:37:26.988342 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:26Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.016480 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.016535 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.016575 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.016604 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.016699 4727 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.016706 4727 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.016759 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:31.016741145 +0000 UTC m=+29.338095982 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.016802 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:31.016781206 +0000 UTC m=+29.338136143 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.016890 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.016908 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.016923 4727 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.016958 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:31.016947541 +0000 UTC m=+29.338302488 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.017044 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.017058 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.017067 4727 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.017094 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:31.017085155 +0000 UTC m=+29.338440102 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.033580 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.065299 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.105165 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.144623 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.372292 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.372480 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.372533 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:27 crc kubenswrapper[4727]: E1001 12:37:27.372739 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.554951 4727 generic.go:334] "Generic (PLEG): container finished" podID="1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee" containerID="78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9" exitCode=0 Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.555046 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" event={"ID":"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee","Type":"ContainerDied","Data":"78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9"} Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.562105 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5"} Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.562148 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60"} Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.562167 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2"} Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.562181 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d"} Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.562190 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d"} Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.571542 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.585956 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.608874 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc 
kubenswrapper[4727]: I1001 12:37:27.620864 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.632280 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.649026 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.665663 4727 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.690409 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f
5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.703179 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.721562 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.738336 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.765224 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.782103 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:27 crc kubenswrapper[4727]: I1001 12:37:27.801049 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:27Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.371767 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:28 crc kubenswrapper[4727]: E1001 12:37:28.371930 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.568441 4727 generic.go:334] "Generic (PLEG): container finished" podID="1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee" containerID="91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e" exitCode=0 Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.568516 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" event={"ID":"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee","Type":"ContainerDied","Data":"91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e"} Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.576256 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8"} Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.584698 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.601937 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.622180 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.640067 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.655410 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.670160 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.688605 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.710610 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f
5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.726338 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.741200 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.757583 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.777992 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.793761 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:28 crc kubenswrapper[4727]: I1001 12:37:28.809280 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:28Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.372245 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.372297 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:29 crc kubenswrapper[4727]: E1001 12:37:29.372437 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:29 crc kubenswrapper[4727]: E1001 12:37:29.372667 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.549610 4727 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.552630 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.552693 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.552712 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.552922 4727 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.562447 4727 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.562937 4727 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.565034 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.565096 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.565109 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.565129 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.565148 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:29Z","lastTransitionTime":"2025-10-01T12:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.583878 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" event={"ID":"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee","Type":"ContainerDied","Data":"8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966"} Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.583592 4727 generic.go:334] "Generic (PLEG): container finished" podID="1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee" containerID="8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966" exitCode=0 Oct 01 12:37:29 crc kubenswrapper[4727]: E1001 12:37:29.588380 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.601800 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.601877 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.601893 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.601918 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.601933 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:29Z","lastTransitionTime":"2025-10-01T12:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.602270 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: E1001 12:37:29.618274 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.618616 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.624854 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.625147 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.625245 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.625343 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.625424 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:29Z","lastTransitionTime":"2025-10-01T12:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.635312 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: E1001 12:37:29.642825 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.648610 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.648677 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.648696 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.648722 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.648740 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:29Z","lastTransitionTime":"2025-10-01T12:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.655954 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: E1001 12:37:29.662923 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.667405 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.667453 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.667466 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.667487 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.667499 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:29Z","lastTransitionTime":"2025-10-01T12:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.679121 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: E1001 12:37:29.680252 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: E1001 12:37:29.680406 4727 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.682875 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.682937 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.682950 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.682972 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.682987 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:29Z","lastTransitionTime":"2025-10-01T12:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.695494 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.711919 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.732978 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.750746 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.766619 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.782769 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.786823 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.786896 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.786911 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.786936 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.786949 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:29Z","lastTransitionTime":"2025-10-01T12:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.808794 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.824311 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.839566 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:29Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.890737 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.890782 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.890794 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.890816 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.890828 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:29Z","lastTransitionTime":"2025-10-01T12:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.993966 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.994045 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.994063 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.994085 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:29 crc kubenswrapper[4727]: I1001 12:37:29.994100 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:29Z","lastTransitionTime":"2025-10-01T12:37:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.098592 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.098636 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.098646 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.098660 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.098670 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:30Z","lastTransitionTime":"2025-10-01T12:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.111355 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-b9wkt"] Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.111907 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-b9wkt" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.115333 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.115428 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.116511 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.117082 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.139308 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.168142 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f
5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.186260 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.202521 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.202577 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.202592 4727 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.202610 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.202623 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:30Z","lastTransitionTime":"2025-10-01T12:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.204867 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.219487 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.236526 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.252065 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/10dcb95f-031f-4e4c-bf15-0c8e1b53674a-serviceca\") pod \"node-ca-b9wkt\" (UID: \"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\") " pod="openshift-image-registry/node-ca-b9wkt" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.252120 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml7nw\" (UniqueName: \"kubernetes.io/projected/10dcb95f-031f-4e4c-bf15-0c8e1b53674a-kube-api-access-ml7nw\") pod \"node-ca-b9wkt\" (UID: \"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\") " pod="openshift-image-registry/node-ca-b9wkt" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.252154 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10dcb95f-031f-4e4c-bf15-0c8e1b53674a-host\") pod \"node-ca-b9wkt\" (UID: \"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\") " pod="openshift-image-registry/node-ca-b9wkt" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.259452 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.276959 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\
\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.291196 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.305787 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.305851 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.305872 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.305896 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.305910 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:30Z","lastTransitionTime":"2025-10-01T12:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.308590 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.323744 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.345775 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.352898 4727 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/10dcb95f-031f-4e4c-bf15-0c8e1b53674a-serviceca\") pod \"node-ca-b9wkt\" (UID: \"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\") " pod="openshift-image-registry/node-ca-b9wkt" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.352943 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml7nw\" (UniqueName: \"kubernetes.io/projected/10dcb95f-031f-4e4c-bf15-0c8e1b53674a-kube-api-access-ml7nw\") pod \"node-ca-b9wkt\" (UID: \"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\") " pod="openshift-image-registry/node-ca-b9wkt" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.352973 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10dcb95f-031f-4e4c-bf15-0c8e1b53674a-host\") pod \"node-ca-b9wkt\" (UID: \"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\") " pod="openshift-image-registry/node-ca-b9wkt" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.353065 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10dcb95f-031f-4e4c-bf15-0c8e1b53674a-host\") pod \"node-ca-b9wkt\" (UID: \"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\") " pod="openshift-image-registry/node-ca-b9wkt" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.354394 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/10dcb95f-031f-4e4c-bf15-0c8e1b53674a-serviceca\") pod \"node-ca-b9wkt\" (UID: \"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\") " pod="openshift-image-registry/node-ca-b9wkt" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.370689 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.371459 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:30 crc kubenswrapper[4727]: E1001 12:37:30.371624 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.380273 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml7nw\" (UniqueName: \"kubernetes.io/projected/10dcb95f-031f-4e4c-bf15-0c8e1b53674a-kube-api-access-ml7nw\") pod \"node-ca-b9wkt\" (UID: \"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\") " pod="openshift-image-registry/node-ca-b9wkt" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.387933 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.404643 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.409652 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.409777 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.409861 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.409981 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.410094 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:30Z","lastTransitionTime":"2025-10-01T12:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.429813 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b9wkt" Oct 01 12:37:30 crc kubenswrapper[4727]: W1001 12:37:30.444891 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10dcb95f_031f_4e4c_bf15_0c8e1b53674a.slice/crio-97378dd51f2d927e6830f57462c617c8dc947799bf56bbe54e42805d4261d677 WatchSource:0}: Error finding container 97378dd51f2d927e6830f57462c617c8dc947799bf56bbe54e42805d4261d677: Status 404 returned error can't find the container with id 97378dd51f2d927e6830f57462c617c8dc947799bf56bbe54e42805d4261d677 Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.513336 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.513413 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.513434 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.513466 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.513540 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:30Z","lastTransitionTime":"2025-10-01T12:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.595034 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.596032 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b9wkt" event={"ID":"10dcb95f-031f-4e4c-bf15-0c8e1b53674a","Type":"ContainerStarted","Data":"97378dd51f2d927e6830f57462c617c8dc947799bf56bbe54e42805d4261d677"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.598678 4727 generic.go:334] "Generic (PLEG): container finished" podID="1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee" containerID="446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44" exitCode=0 Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.598711 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" event={"ID":"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee","Type":"ContainerDied","Data":"446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.611219 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.617349 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.617386 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.617395 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.617411 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.617420 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:30Z","lastTransitionTime":"2025-10-01T12:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.624540 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.637140 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.649193 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.669153 4727 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871d
df8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.686358 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.701106 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.714719 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.719251 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.719284 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.719293 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:30 crc 
kubenswrapper[4727]: I1001 12:37:30.719308 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.719317 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:30Z","lastTransitionTime":"2025-10-01T12:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.734059 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d8
1ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.746822 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.759128 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.776804 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.789594 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.806554 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.818935 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:30Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.821911 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.821959 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.821971 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.821987 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.821995 4727 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:30Z","lastTransitionTime":"2025-10-01T12:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.925231 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.925307 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.925327 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.925354 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.925376 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:30Z","lastTransitionTime":"2025-10-01T12:37:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:30 crc kubenswrapper[4727]: I1001 12:37:30.958095 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:30 crc kubenswrapper[4727]: E1001 12:37:30.958357 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:37:38.958330316 +0000 UTC m=+37.279685153 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.028568 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.028627 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.028641 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.028661 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.028674 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:31Z","lastTransitionTime":"2025-10-01T12:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.059052 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.059115 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.059144 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.059166 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.059273 4727 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.059315 4727 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.059360 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:39.059335949 +0000 UTC m=+37.380690786 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.059407 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.059462 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:39.059436892 +0000 UTC m=+37.380791719 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.059476 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.059497 4727 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.059492 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.059564 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.059581 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:39.059555476 +0000 UTC m=+37.380910483 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.059588 4727 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.059688 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:39.059657259 +0000 UTC m=+37.381012126 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.132194 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.132245 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.132256 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.132272 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.132282 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:31Z","lastTransitionTime":"2025-10-01T12:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.235560 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.235610 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.235624 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.235644 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.235657 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:31Z","lastTransitionTime":"2025-10-01T12:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.337573 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.337634 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.337653 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.337675 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.337691 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:31Z","lastTransitionTime":"2025-10-01T12:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.371418 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.371433 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.371625 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:31 crc kubenswrapper[4727]: E1001 12:37:31.371752 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.440906 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.440967 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.440985 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.441133 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.441155 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:31Z","lastTransitionTime":"2025-10-01T12:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.544116 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.544184 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.544202 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.544228 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.544246 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:31Z","lastTransitionTime":"2025-10-01T12:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.605692 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b9wkt" event={"ID":"10dcb95f-031f-4e4c-bf15-0c8e1b53674a","Type":"ContainerStarted","Data":"375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6"} Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.610253 4727 generic.go:334] "Generic (PLEG): container finished" podID="1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee" containerID="013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd" exitCode=0 Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.610312 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" event={"ID":"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee","Type":"ContainerDied","Data":"013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd"} Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.641191 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"fini
shedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.646798 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.646838 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.646850 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.646868 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.646889 4727 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:31Z","lastTransitionTime":"2025-10-01T12:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.666116 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.683492 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.704162 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.722054 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.743487 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.749547 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.749610 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.749634 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.749660 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.749678 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:31Z","lastTransitionTime":"2025-10-01T12:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.771164 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.792559 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z 
is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.808631 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.826317 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.843389 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.852465 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.852507 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.852517 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.852535 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.852549 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:31Z","lastTransitionTime":"2025-10-01T12:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.858121 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.871526 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.887844 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.906923 4727 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871d
df8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.923208 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.937700 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.953511 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.956056 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.956081 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.956092 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.956109 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.956120 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:31Z","lastTransitionTime":"2025-10-01T12:37:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.971135 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2
x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:31 crc kubenswrapper[4727]: I1001 12:37:31.992741 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f
5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.010176 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.028569 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.047702 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.059582 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.059653 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.059668 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.059690 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.059705 4727 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:32Z","lastTransitionTime":"2025-10-01T12:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.066526 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.085206 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.100811 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.122514 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.139902 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.156478 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.162657 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.162702 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.162714 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.162731 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.162742 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:32Z","lastTransitionTime":"2025-10-01T12:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.167740 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.266292 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.266344 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.266355 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.266371 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.266381 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:32Z","lastTransitionTime":"2025-10-01T12:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.368898 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.368944 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.368958 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.368976 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.368989 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:32Z","lastTransitionTime":"2025-10-01T12:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.372869 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:32 crc kubenswrapper[4727]: E1001 12:37:32.373036 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.396246 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.411952 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.427162 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.442518 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.463063 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.471165 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.471192 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.471201 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.471213 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.471224 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:32Z","lastTransitionTime":"2025-10-01T12:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.476981 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.497190 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.509763 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.521591 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.531218 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.540513 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.567783 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.579577 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.579621 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.579635 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.579657 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.579673 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:32Z","lastTransitionTime":"2025-10-01T12:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.616171 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af"} Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.617190 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.618932 4727 generic.go:334] "Generic (PLEG): container finished" podID="1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee" containerID="c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f" exitCode=0 Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.619687 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" event={"ID":"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee","Type":"ContainerDied","Data":"c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f"} Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.632088 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.641877 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.646645 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.658372 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.670777 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.681273 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.681308 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.681320 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.681337 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.681348 4727 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:32Z","lastTransitionTime":"2025-10-01T12:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.682462 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.693432 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.703240 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.720689 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.734575 4727 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.748018 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.756765 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.769992 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.782288 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.783960 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.783992 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.784023 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.784040 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.784051 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:32Z","lastTransitionTime":"2025-10-01T12:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.796103 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.812778 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-1
0-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.838597 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\"
:0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.851414 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.865988 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.886573 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.886604 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.886615 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.886632 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.886644 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:32Z","lastTransitionTime":"2025-10-01T12:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.990291 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.990341 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.990352 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.990369 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:32 crc kubenswrapper[4727]: I1001 12:37:32.990385 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:32Z","lastTransitionTime":"2025-10-01T12:37:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.095027 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.095065 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.095076 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.095095 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.095107 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:33Z","lastTransitionTime":"2025-10-01T12:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.200793 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.200847 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.201083 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.201107 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.201125 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:33Z","lastTransitionTime":"2025-10-01T12:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.304568 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.304626 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.304639 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.304658 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.304670 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:33Z","lastTransitionTime":"2025-10-01T12:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.371620 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.371646 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:33 crc kubenswrapper[4727]: E1001 12:37:33.371808 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:33 crc kubenswrapper[4727]: E1001 12:37:33.371969 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.408265 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.408308 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.408317 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.408330 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.408339 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:33Z","lastTransitionTime":"2025-10-01T12:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.510945 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.510991 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.511015 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.511030 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.511040 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:33Z","lastTransitionTime":"2025-10-01T12:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.614399 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.614460 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.614479 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.614500 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.614513 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:33Z","lastTransitionTime":"2025-10-01T12:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.627880 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" event={"ID":"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee","Type":"ContainerStarted","Data":"5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e"} Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.627938 4727 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.628601 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.644546 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.658500 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.662975 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mo
untPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.681815 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.698142 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.718211 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.718288 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.718309 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.718335 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.718351 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:33Z","lastTransitionTime":"2025-10-01T12:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.719787 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.750041 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.765255 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.786016 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.802606 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.821940 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.822015 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.822032 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.822054 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.822070 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:33Z","lastTransitionTime":"2025-10-01T12:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.823779 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.845339 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lo
g/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.906079 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.925357 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.925395 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.925403 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.925418 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.925428 4727 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:33Z","lastTransitionTime":"2025-10-01T12:37:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.925694 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.943240 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.962394 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:33 crc kubenswrapper[4727]: I1001 12:37:33.985939 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f
5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.009861 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.027592 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.027640 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.027652 4727 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.027669 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.027681 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:34Z","lastTransitionTime":"2025-10-01T12:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.035638 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.054145 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.076792 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.091172 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.109159 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.131411 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.131461 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.131474 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.131492 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.131506 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:34Z","lastTransitionTime":"2025-10-01T12:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.138654 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdcefb7ed6231118a5caccd5654294331a12880
86d0198466a9dedc55b881af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.157410 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.179074 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.195706 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.209322 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.220978 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.234889 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.234953 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.234965 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.235014 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.235028 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:34Z","lastTransitionTime":"2025-10-01T12:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.239108 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.257358 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:34Z is after 
2025-08-24T17:21:41Z" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.338499 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.338534 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.338558 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.338573 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.338583 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:34Z","lastTransitionTime":"2025-10-01T12:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.372191 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:34 crc kubenswrapper[4727]: E1001 12:37:34.372323 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.441039 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.441096 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.441113 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.441145 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.441160 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:34Z","lastTransitionTime":"2025-10-01T12:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.544672 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.544728 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.544746 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.544767 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.544780 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:34Z","lastTransitionTime":"2025-10-01T12:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.632380 4727 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.648020 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.648093 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.648108 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.648131 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.648147 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:34Z","lastTransitionTime":"2025-10-01T12:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.751775 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.751861 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.751887 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.751919 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.751937 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:34Z","lastTransitionTime":"2025-10-01T12:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.855621 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.855709 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.855733 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.855769 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.855794 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:34Z","lastTransitionTime":"2025-10-01T12:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.958945 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.959014 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.959033 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.959054 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:34 crc kubenswrapper[4727]: I1001 12:37:34.959066 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:34Z","lastTransitionTime":"2025-10-01T12:37:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.061744 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.061787 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.061798 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.061813 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.061824 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:35Z","lastTransitionTime":"2025-10-01T12:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.163879 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.163921 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.163934 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.163950 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.163960 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:35Z","lastTransitionTime":"2025-10-01T12:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.266633 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.267024 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.267470 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.267577 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.267645 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:35Z","lastTransitionTime":"2025-10-01T12:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.370416 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.370780 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.370864 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.370966 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.371080 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:35Z","lastTransitionTime":"2025-10-01T12:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.371270 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:35 crc kubenswrapper[4727]: E1001 12:37:35.371403 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.371287 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:35 crc kubenswrapper[4727]: E1001 12:37:35.371518 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.474073 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.474106 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.474116 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.474130 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.474138 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:35Z","lastTransitionTime":"2025-10-01T12:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.576626 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.576681 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.576695 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.576714 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.576726 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:35Z","lastTransitionTime":"2025-10-01T12:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.636097 4727 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.683738 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.683799 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.683829 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.683854 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.683874 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:35Z","lastTransitionTime":"2025-10-01T12:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.787046 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.787100 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.787112 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.787131 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.787144 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:35Z","lastTransitionTime":"2025-10-01T12:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.889744 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.889813 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.889831 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.889855 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.889874 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:35Z","lastTransitionTime":"2025-10-01T12:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.993196 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.993265 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.993284 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.993311 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:35 crc kubenswrapper[4727]: I1001 12:37:35.993334 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:35Z","lastTransitionTime":"2025-10-01T12:37:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.096861 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.096930 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.096948 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.096973 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.096992 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:36Z","lastTransitionTime":"2025-10-01T12:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.199948 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.200031 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.200051 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.200076 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.200092 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:36Z","lastTransitionTime":"2025-10-01T12:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.303330 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.303379 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.303397 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.303421 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.303439 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:36Z","lastTransitionTime":"2025-10-01T12:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.372685 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:36 crc kubenswrapper[4727]: E1001 12:37:36.373478 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.405963 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.406066 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.406093 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.406120 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.406141 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:36Z","lastTransitionTime":"2025-10-01T12:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.508630 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.508702 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.508727 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.508757 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.508780 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:36Z","lastTransitionTime":"2025-10-01T12:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.612167 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.612236 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.612253 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.612276 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.612293 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:36Z","lastTransitionTime":"2025-10-01T12:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.648458 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/0.log" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.653213 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af" exitCode=1 Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.653261 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af"} Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.654749 4727 scope.go:117] "RemoveContainer" containerID="0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.689751 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f
5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.709322 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.715413 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.715459 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.715471 4727 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.715487 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.715498 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:36Z","lastTransitionTime":"2025-10-01T12:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.727392 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.749319 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.765407 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.782782 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.798293 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.819616 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.819668 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.819679 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.819695 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.819705 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:36Z","lastTransitionTime":"2025-10-01T12:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.828170 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fdcefb7ed6231118a5caccd5654294331a12880
86d0198466a9dedc55b881af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:35Z\\\",\\\"message\\\":\\\"pi/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 12:37:35.585446 6015 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585524 6015 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:37:35.585624 6015 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585808 6015 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586094 6015 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586596 6015 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:37:35.586627 6015 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:37:35.586646 6015 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:37:35.586717 6015 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:37:35.586746 6015 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:37:35.586754 6015 factory.go:656] Stopping watch factory\\\\nI1001 12:37:35.586774 6015 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.844301 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.863633 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.878966 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.895829 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.912580 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.927500 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.927553 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.927567 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.927586 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.927602 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:36Z","lastTransitionTime":"2025-10-01T12:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.932774 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:36 crc kubenswrapper[4727]: I1001 12:37:36.949479 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:36Z is after 
2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.030517 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.030579 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.030597 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.030620 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.030637 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:37Z","lastTransitionTime":"2025-10-01T12:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.134552 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.134609 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.134627 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.134656 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.134672 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:37Z","lastTransitionTime":"2025-10-01T12:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.238268 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.238350 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.238373 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.238405 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.238430 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:37Z","lastTransitionTime":"2025-10-01T12:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.341301 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.341368 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.341388 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.341415 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.341433 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:37Z","lastTransitionTime":"2025-10-01T12:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.371618 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.371640 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:37 crc kubenswrapper[4727]: E1001 12:37:37.371806 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:37 crc kubenswrapper[4727]: E1001 12:37:37.371979 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.444565 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.444618 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.444637 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.444662 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.444680 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:37Z","lastTransitionTime":"2025-10-01T12:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.547285 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.547353 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.547377 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.547406 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.547428 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:37Z","lastTransitionTime":"2025-10-01T12:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.650051 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.650091 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.650104 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.650122 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.650132 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:37Z","lastTransitionTime":"2025-10-01T12:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.658016 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/0.log" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.661411 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639"} Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.661528 4727 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.678941 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.690364 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.703636 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.722068 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e641
6640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.752891 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.752932 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.752943 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.752960 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.752972 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:37Z","lastTransitionTime":"2025-10-01T12:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.754326 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.768596 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.786795 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.808723 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd"] Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.809244 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.811552 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.811609 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.821137 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2
494f47613caa63852df78639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:35Z\\\",\\\"message\\\":\\\"pi/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 12:37:35.585446 6015 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585524 6015 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:37:35.585624 6015 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585808 6015 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586094 6015 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586596 6015 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:37:35.586627 6015 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:37:35.586646 6015 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:37:35.586717 6015 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:37:35.586746 6015 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:37:35.586754 6015 factory.go:656] Stopping watch factory\\\\nI1001 12:37:35.586774 6015 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.837202 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bed486c6-587b-40ec-a908-064c3623b893-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gfkfd\" (UID: \"bed486c6-587b-40ec-a908-064c3623b893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.837256 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twp42\" (UniqueName: \"kubernetes.io/projected/bed486c6-587b-40ec-a908-064c3623b893-kube-api-access-twp42\") pod \"ovnkube-control-plane-749d76644c-gfkfd\" (UID: \"bed486c6-587b-40ec-a908-064c3623b893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.837281 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bed486c6-587b-40ec-a908-064c3623b893-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gfkfd\" (UID: \"bed486c6-587b-40ec-a908-064c3623b893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.837316 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bed486c6-587b-40ec-a908-064c3623b893-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gfkfd\" (UID: \"bed486c6-587b-40ec-a908-064c3623b893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.837968 4727 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.855514 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.855549 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.855560 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.855578 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.855590 4727 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:37Z","lastTransitionTime":"2025-10-01T12:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.857244 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.877321 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.893623 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.916204 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.932029 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.938106 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bed486c6-587b-40ec-a908-064c3623b893-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gfkfd\" (UID: \"bed486c6-587b-40ec-a908-064c3623b893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.938185 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bed486c6-587b-40ec-a908-064c3623b893-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gfkfd\" (UID: \"bed486c6-587b-40ec-a908-064c3623b893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.938237 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twp42\" (UniqueName: \"kubernetes.io/projected/bed486c6-587b-40ec-a908-064c3623b893-kube-api-access-twp42\") pod \"ovnkube-control-plane-749d76644c-gfkfd\" (UID: \"bed486c6-587b-40ec-a908-064c3623b893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.938276 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bed486c6-587b-40ec-a908-064c3623b893-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gfkfd\" (UID: \"bed486c6-587b-40ec-a908-064c3623b893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.939201 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bed486c6-587b-40ec-a908-064c3623b893-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gfkfd\" (UID: \"bed486c6-587b-40ec-a908-064c3623b893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.939207 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bed486c6-587b-40ec-a908-064c3623b893-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gfkfd\" (UID: \"bed486c6-587b-40ec-a908-064c3623b893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.944410 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.946596 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bed486c6-587b-40ec-a908-064c3623b893-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gfkfd\" (UID: \"bed486c6-587b-40ec-a908-064c3623b893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.954412 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twp42\" (UniqueName: \"kubernetes.io/projected/bed486c6-587b-40ec-a908-064c3623b893-kube-api-access-twp42\") pod \"ovnkube-control-plane-749d76644c-gfkfd\" (UID: \"bed486c6-587b-40ec-a908-064c3623b893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.957754 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.957793 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.957807 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.957832 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.957846 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:37Z","lastTransitionTime":"2025-10-01T12:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.959520 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.979520 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\
"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:37 crc kubenswrapper[4727]: I1001 12:37:37.993691 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.007226 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.020912 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.041427 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2
494f47613caa63852df78639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:35Z\\\",\\\"message\\\":\\\"pi/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 12:37:35.585446 6015 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585524 6015 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:37:35.585624 6015 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585808 6015 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586094 6015 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586596 6015 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:37:35.586627 6015 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:37:35.586646 6015 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:37:35.586717 6015 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:37:35.586746 6015 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:37:35.586754 6015 factory.go:656] Stopping watch factory\\\\nI1001 12:37:35.586774 6015 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.054033 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.060402 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.060564 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.060673 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.060760 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.060847 4727 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:38Z","lastTransitionTime":"2025-10-01T12:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.073410 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.092479 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.109392 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.122353 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.124877 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.136487 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.154367 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.163580 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.163633 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.163649 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.163671 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.163686 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:38Z","lastTransitionTime":"2025-10-01T12:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.169838 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.187407 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.208431 4727 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.266592 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.266639 4727 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.266653 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.266670 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.266680 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:38Z","lastTransitionTime":"2025-10-01T12:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.369224 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.369285 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.369299 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.369319 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.369336 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:38Z","lastTransitionTime":"2025-10-01T12:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.371549 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:38 crc kubenswrapper[4727]: E1001 12:37:38.371693 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.471616 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.471648 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.471659 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.471674 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.471688 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:38Z","lastTransitionTime":"2025-10-01T12:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.573820 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.573866 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.573877 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.573891 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.573901 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:38Z","lastTransitionTime":"2025-10-01T12:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.669673 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" event={"ID":"bed486c6-587b-40ec-a908-064c3623b893","Type":"ContainerStarted","Data":"d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.669766 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" event={"ID":"bed486c6-587b-40ec-a908-064c3623b893","Type":"ContainerStarted","Data":"8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.669797 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" event={"ID":"bed486c6-587b-40ec-a908-064c3623b893","Type":"ContainerStarted","Data":"9aa791680f0bbc2812d4be3ec7f253e09bd8cec37c8c227585b1463c67a93e34"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.672487 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/1.log" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.673115 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/0.log" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.675359 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.675395 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.675407 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.675425 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.675436 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:38Z","lastTransitionTime":"2025-10-01T12:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.677033 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639" exitCode=1 Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.677070 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.677127 4727 scope.go:117] "RemoveContainer" containerID="0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.677868 4727 scope.go:117] "RemoveContainer" containerID="5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639" Oct 01 12:37:38 crc kubenswrapper[4727]: E1001 12:37:38.678059 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.694712 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6
de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.715548 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.736080 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 
12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.765162 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.780912 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.781877 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.781971 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.782022 4727 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.782050 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.782071 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:38Z","lastTransitionTime":"2025-10-01T12:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.796566 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.810741 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.832727 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2
494f47613caa63852df78639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:35Z\\\",\\\"message\\\":\\\"pi/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 12:37:35.585446 6015 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585524 6015 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:37:35.585624 6015 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585808 6015 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586094 6015 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586596 6015 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:37:35.586627 6015 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:37:35.586646 6015 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:37:35.586717 6015 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:37:35.586746 6015 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:37:35.586754 6015 factory.go:656] Stopping watch factory\\\\nI1001 12:37:35.586774 6015 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.850921 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.867874 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.883833 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.885093 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.885117 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.885128 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.885144 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.885156 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:38Z","lastTransitionTime":"2025-10-01T12:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.902825 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.920209 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.942680 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.987445 4727 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.987497 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.987509 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.987527 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.987540 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:38Z","lastTransitionTime":"2025-10-01T12:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:38 crc kubenswrapper[4727]: I1001 12:37:38.987583 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.002756 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.015918 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.030081 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.039247 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.049842 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.051077 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.051244 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:37:55.051223527 +0000 UTC m=+53.372578364 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.059012 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.068600 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.082100 4727 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.090210 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.090261 4727 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.090270 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.090286 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.090296 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.103863 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-
10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.121335 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.139045 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.151647 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.151683 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.151708 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.151788 4727 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.151820 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.151841 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.151855 4727 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.151863 4727 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.151856 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:55.151837969 +0000 UTC m=+53.473192806 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.151727 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.151893 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:55.151885041 +0000 UTC m=+53.473239878 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.151926 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:55.151898301 +0000 UTC m=+53.473253138 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.152044 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.152112 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.152138 4727 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.152248 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:55.15221889 +0000 UTC m=+53.473573767 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.153887 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.175607 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2
494f47613caa63852df78639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:35Z\\\",\\\"message\\\":\\\"pi/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 12:37:35.585446 6015 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585524 6015 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:37:35.585624 6015 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585808 6015 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586094 6015 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586596 6015 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:37:35.586627 6015 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:37:35.586646 6015 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:37:35.586717 6015 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:37:35.586746 6015 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:37:35.586754 6015 factory.go:656] Stopping watch factory\\\\nI1001 12:37:35.586774 6015 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI1001 12:37:37.918925 6163 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI1001 12:37:37.918941 6163 services_controller.go:454] Service 
openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF1001 12:37:37.918966 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.193012 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.193055 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.193065 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.193082 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.193091 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.197413 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.212078 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.223875 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.237619 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.295939 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.296023 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.296039 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.296056 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.296066 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.371953 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.372129 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.371954 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.372422 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.400552 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.400621 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.400637 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.400661 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.400680 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.503909 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.504311 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.504390 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.504483 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.504559 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.608113 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.608161 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.608175 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.608195 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.608210 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.676487 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-tvtzh"] Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.677131 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.677213 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.685872 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/1.log" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.692605 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.692662 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.692675 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.692692 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.692708 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.699942 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.711501 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.717397 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.717456 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.717470 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.717496 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.717512 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.718266 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.737149 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 
12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.742576 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.747382 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.747419 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.747432 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.747451 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.747466 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.752890 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.759369 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.759433 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ljxj\" (UniqueName: \"kubernetes.io/projected/f7f4ab8d-5f57-47bd-93fc-9219c596c436-kube-api-access-4ljxj\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.767355 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.771429 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.771487 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.771498 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.771521 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.771534 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.785365 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.789918 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.795083 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.795139 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.795159 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.795183 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.795200 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.802233 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.812235 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.812401 4727 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.814506 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.814545 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.814564 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.814587 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.814603 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.819366 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.834429 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.854452 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2
494f47613caa63852df78639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:35Z\\\",\\\"message\\\":\\\"pi/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 12:37:35.585446 6015 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585524 6015 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:37:35.585624 6015 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585808 6015 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586094 6015 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586596 6015 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:37:35.586627 6015 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:37:35.586646 6015 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:37:35.586717 6015 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:37:35.586746 6015 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:37:35.586754 6015 factory.go:656] Stopping watch factory\\\\nI1001 12:37:35.586774 6015 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI1001 12:37:37.918925 6163 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI1001 12:37:37.918941 6163 services_controller.go:454] Service 
openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF1001 12:37:37.918966 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.860367 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.860425 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ljxj\" (UniqueName: \"kubernetes.io/projected/f7f4ab8d-5f57-47bd-93fc-9219c596c436-kube-api-access-4ljxj\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.860648 4727 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:37:39 crc kubenswrapper[4727]: E1001 12:37:39.860809 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs podName:f7f4ab8d-5f57-47bd-93fc-9219c596c436 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:40.360773292 +0000 UTC m=+38.682128349 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs") pod "network-metrics-daemon-tvtzh" (UID: "f7f4ab8d-5f57-47bd-93fc-9219c596c436") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.876233 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"
,\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.891219 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ljxj\" (UniqueName: \"kubernetes.io/projected/f7f4ab8d-5f57-47bd-93fc-9219c596c436-kube-api-access-4ljxj\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.891489 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.904793 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.917539 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.917585 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.917596 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.917613 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.917626 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:39Z","lastTransitionTime":"2025-10-01T12:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.921860 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.934692 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.945695 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.967106 4727 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:39 crc kubenswrapper[4727]: I1001 12:37:39.986586 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.020212 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.020270 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.020288 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.020313 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.020330 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:40Z","lastTransitionTime":"2025-10-01T12:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.123856 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.123950 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.123977 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.124051 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.124084 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:40Z","lastTransitionTime":"2025-10-01T12:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.227823 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.227867 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.227876 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.227896 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.227908 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:40Z","lastTransitionTime":"2025-10-01T12:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.331810 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.332181 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.332301 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.332329 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.332349 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:40Z","lastTransitionTime":"2025-10-01T12:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.365421 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:40 crc kubenswrapper[4727]: E1001 12:37:40.365632 4727 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:37:40 crc kubenswrapper[4727]: E1001 12:37:40.365777 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs podName:f7f4ab8d-5f57-47bd-93fc-9219c596c436 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:41.36572386 +0000 UTC m=+39.687078737 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs") pod "network-metrics-daemon-tvtzh" (UID: "f7f4ab8d-5f57-47bd-93fc-9219c596c436") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.372188 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:40 crc kubenswrapper[4727]: E1001 12:37:40.372368 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.436350 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.436435 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.436460 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.436490 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.436516 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:40Z","lastTransitionTime":"2025-10-01T12:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.515068 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.540303 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.540360 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.540317 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.540377 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.540644 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.540673 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:40Z","lastTransitionTime":"2025-10-01T12:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.563167 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.582972 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.606139 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.628112 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.644778 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.644859 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.644888 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.644923 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.644987 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:40Z","lastTransitionTime":"2025-10-01T12:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.650314 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.674727 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 
2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.708136 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/
etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025
-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.729412 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.747525 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.747847 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.747947 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.748102 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.748200 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:40Z","lastTransitionTime":"2025-10-01T12:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.753066 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.774228 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 
12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.791891 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.810730 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.828629 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.848676 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.851456 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.851516 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.851528 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.851547 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.851558 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:40Z","lastTransitionTime":"2025-10-01T12:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.871543 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.896705 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:35Z\\\",\\\"message\\\":\\\"pi/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 12:37:35.585446 6015 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585524 6015 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:37:35.585624 6015 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585808 6015 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586094 6015 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586596 6015 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:37:35.586627 6015 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:37:35.586646 6015 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:37:35.586717 6015 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:37:35.586746 6015 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:37:35.586754 6015 factory.go:656] Stopping 
watch factory\\\\nI1001 12:37:35.586774 6015 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI1001 12:37:37.918925 6163 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI1001 12:37:37.918941 6163 services_controller.go:454] Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF1001 12:37:37.918966 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.954152 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.954231 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.954256 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.954284 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:40 crc kubenswrapper[4727]: I1001 12:37:40.954303 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:40Z","lastTransitionTime":"2025-10-01T12:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.058614 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.059120 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.059306 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.059493 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.059652 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:41Z","lastTransitionTime":"2025-10-01T12:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.165113 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.165194 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.165221 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.165256 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.165284 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:41Z","lastTransitionTime":"2025-10-01T12:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.269598 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.269680 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.269705 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.269740 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.269763 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:41Z","lastTransitionTime":"2025-10-01T12:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.371838 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.372439 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.372560 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:41 crc kubenswrapper[4727]: E1001 12:37:41.372181 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:37:41 crc kubenswrapper[4727]: E1001 12:37:41.373066 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:41 crc kubenswrapper[4727]: E1001 12:37:41.373167 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.373336 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.373554 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.373697 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.373827 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.373952 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:41Z","lastTransitionTime":"2025-10-01T12:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.377197 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:41 crc kubenswrapper[4727]: E1001 12:37:41.377412 4727 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:37:41 crc kubenswrapper[4727]: E1001 12:37:41.377539 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs podName:f7f4ab8d-5f57-47bd-93fc-9219c596c436 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:43.3775049 +0000 UTC m=+41.698859767 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs") pod "network-metrics-daemon-tvtzh" (UID: "f7f4ab8d-5f57-47bd-93fc-9219c596c436") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.478263 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.478693 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.478821 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.478965 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.479400 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:41Z","lastTransitionTime":"2025-10-01T12:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.582126 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.582205 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.582227 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.582259 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.582280 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:41Z","lastTransitionTime":"2025-10-01T12:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.686391 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.686456 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.686474 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.686500 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.686519 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:41Z","lastTransitionTime":"2025-10-01T12:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.790326 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.790375 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.790385 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.790402 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.790413 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:41Z","lastTransitionTime":"2025-10-01T12:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.893028 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.893341 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.893436 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.893516 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.893605 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:41Z","lastTransitionTime":"2025-10-01T12:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.996642 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.996993 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.997273 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.997438 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:41 crc kubenswrapper[4727]: I1001 12:37:41.997561 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:41Z","lastTransitionTime":"2025-10-01T12:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.101602 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.101688 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.101713 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.101744 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.101770 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:42Z","lastTransitionTime":"2025-10-01T12:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.205907 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.205967 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.205988 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.206051 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.206077 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:42Z","lastTransitionTime":"2025-10-01T12:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.309977 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.310096 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.310119 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.310153 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.310178 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:42Z","lastTransitionTime":"2025-10-01T12:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.372183 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:42 crc kubenswrapper[4727]: E1001 12:37:42.372425 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.390120 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.408883 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 
12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.412965 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.413046 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.413060 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.413079 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.413120 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:42Z","lastTransitionTime":"2025-10-01T12:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.423794 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.452179 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f
5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.480466 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.501665 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.516070 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.516371 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.516501 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.516643 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.516741 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:42Z","lastTransitionTime":"2025-10-01T12:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.516893 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.539949 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fdcefb7ed6231118a5caccd5654294331a1288086d0198466a9dedc55b881af\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:35Z\\\",\\\"message\\\":\\\"pi/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 12:37:35.585446 6015 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585524 6015 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 12:37:35.585624 6015 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.585808 6015 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586094 6015 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 12:37:35.586596 6015 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 12:37:35.586627 6015 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 12:37:35.586646 6015 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 12:37:35.586717 6015 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 12:37:35.586746 6015 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 12:37:35.586754 6015 factory.go:656] Stopping 
watch factory\\\\nI1001 12:37:35.586774 6015 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI1001 12:37:37.918925 6163 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI1001 12:37:37.918941 6163 services_controller.go:454] Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF1001 12:37:37.918966 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.560164 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-c
luster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 
12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.577406 4727 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.590822 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.604397 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.615831 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.620052 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.620088 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.620101 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.620127 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.620141 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:42Z","lastTransitionTime":"2025-10-01T12:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.630399 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"starte
dAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 
2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.646922 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.661063 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.672984 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:42Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.723053 4727 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.723112 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.723123 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.723139 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.723150 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:42Z","lastTransitionTime":"2025-10-01T12:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.826768 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.826815 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.826831 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.826852 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.826871 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:42Z","lastTransitionTime":"2025-10-01T12:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.930387 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.931070 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.931284 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.931519 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:42 crc kubenswrapper[4727]: I1001 12:37:42.931691 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:42Z","lastTransitionTime":"2025-10-01T12:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.035664 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.035748 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.035775 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.035814 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.035839 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:43Z","lastTransitionTime":"2025-10-01T12:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.139235 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.139304 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.139327 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.139360 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.139382 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:43Z","lastTransitionTime":"2025-10-01T12:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.243700 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.244284 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.244538 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.244736 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.244977 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:43Z","lastTransitionTime":"2025-10-01T12:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.347923 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.348042 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.348071 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.348112 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.348140 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:43Z","lastTransitionTime":"2025-10-01T12:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.371612 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.371624 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.371624 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:43 crc kubenswrapper[4727]: E1001 12:37:43.372283 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:43 crc kubenswrapper[4727]: E1001 12:37:43.372276 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:43 crc kubenswrapper[4727]: E1001 12:37:43.372422 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.401486 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:43 crc kubenswrapper[4727]: E1001 12:37:43.401777 4727 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:37:43 crc kubenswrapper[4727]: E1001 12:37:43.401879 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs podName:f7f4ab8d-5f57-47bd-93fc-9219c596c436 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:47.401844735 +0000 UTC m=+45.723199612 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs") pod "network-metrics-daemon-tvtzh" (UID: "f7f4ab8d-5f57-47bd-93fc-9219c596c436") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.452384 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.452446 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.452464 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.452491 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.452511 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:43Z","lastTransitionTime":"2025-10-01T12:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.557616 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.558234 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.558393 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.558454 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.558498 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:43Z","lastTransitionTime":"2025-10-01T12:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.662626 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.662754 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.662777 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.662805 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.662823 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:43Z","lastTransitionTime":"2025-10-01T12:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.771318 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.771394 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.771413 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.771453 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.771475 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:43Z","lastTransitionTime":"2025-10-01T12:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.874911 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.874980 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.875038 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.875068 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.875086 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:43Z","lastTransitionTime":"2025-10-01T12:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.979133 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.979194 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.979209 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.979230 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:43 crc kubenswrapper[4727]: I1001 12:37:43.979244 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:43Z","lastTransitionTime":"2025-10-01T12:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.081812 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.081868 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.081885 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.081908 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.081926 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:44Z","lastTransitionTime":"2025-10-01T12:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.185509 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.185574 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.185593 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.185619 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.185638 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:44Z","lastTransitionTime":"2025-10-01T12:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.289147 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.289232 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.289246 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.289270 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.289284 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:44Z","lastTransitionTime":"2025-10-01T12:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.371631 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:44 crc kubenswrapper[4727]: E1001 12:37:44.371916 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.391917 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.392041 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.392064 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.392091 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.392113 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:44Z","lastTransitionTime":"2025-10-01T12:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.495126 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.495195 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.495212 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.495238 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.495253 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:44Z","lastTransitionTime":"2025-10-01T12:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.598426 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.598741 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.598935 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.599118 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.599278 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:44Z","lastTransitionTime":"2025-10-01T12:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.701761 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.701800 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.701809 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.701822 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.701830 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:44Z","lastTransitionTime":"2025-10-01T12:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.805758 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.805820 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.805834 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.805858 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.805874 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:44Z","lastTransitionTime":"2025-10-01T12:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.909492 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.909567 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.909587 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.909615 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:44 crc kubenswrapper[4727]: I1001 12:37:44.909634 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:44Z","lastTransitionTime":"2025-10-01T12:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.012137 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.012229 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.012243 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.012264 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.012280 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:45Z","lastTransitionTime":"2025-10-01T12:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.115446 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.115511 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.115527 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.115548 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.115564 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:45Z","lastTransitionTime":"2025-10-01T12:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.219173 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.219271 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.219804 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.219846 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.219870 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:45Z","lastTransitionTime":"2025-10-01T12:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.323372 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.323449 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.323471 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.323509 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.323541 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:45Z","lastTransitionTime":"2025-10-01T12:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.371862 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.371901 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:45 crc kubenswrapper[4727]: E1001 12:37:45.372138 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.372258 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:45 crc kubenswrapper[4727]: E1001 12:37:45.372365 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:45 crc kubenswrapper[4727]: E1001 12:37:45.372516 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.427278 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.427667 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.427862 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.428103 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.428354 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:45Z","lastTransitionTime":"2025-10-01T12:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.530683 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.530725 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.530742 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.530759 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.530772 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:45Z","lastTransitionTime":"2025-10-01T12:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.633592 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.633633 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.633641 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.633655 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.633663 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:45Z","lastTransitionTime":"2025-10-01T12:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.736027 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.736089 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.736103 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.736120 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.736132 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:45Z","lastTransitionTime":"2025-10-01T12:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.839219 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.839483 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.839496 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.839518 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.839532 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:45Z","lastTransitionTime":"2025-10-01T12:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.946327 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.946378 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.946391 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.946408 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:45 crc kubenswrapper[4727]: I1001 12:37:45.946420 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:45Z","lastTransitionTime":"2025-10-01T12:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.049222 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.049283 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.049301 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.049325 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.049342 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:46Z","lastTransitionTime":"2025-10-01T12:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.152233 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.152298 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.152317 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.152343 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.152360 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:46Z","lastTransitionTime":"2025-10-01T12:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.255659 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.255717 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.255729 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.255747 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.255759 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:46Z","lastTransitionTime":"2025-10-01T12:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.357909 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.358051 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.358065 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.358083 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.358097 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:46Z","lastTransitionTime":"2025-10-01T12:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.372501 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:46 crc kubenswrapper[4727]: E1001 12:37:46.372714 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.460795 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.460854 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.460865 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.460880 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.460899 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:46Z","lastTransitionTime":"2025-10-01T12:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.564878 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.565117 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.565140 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.565169 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.565188 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:46Z","lastTransitionTime":"2025-10-01T12:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.668955 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.669070 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.669090 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.669117 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.669134 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:46Z","lastTransitionTime":"2025-10-01T12:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.772474 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.772532 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.772541 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.772561 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.772574 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:46Z","lastTransitionTime":"2025-10-01T12:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.875977 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.876038 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.876051 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.876068 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.876081 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:46Z","lastTransitionTime":"2025-10-01T12:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.978669 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.978757 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.978782 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.978814 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:46 crc kubenswrapper[4727]: I1001 12:37:46.978836 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:46Z","lastTransitionTime":"2025-10-01T12:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.082837 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.082906 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.082926 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.082955 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.082974 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:47Z","lastTransitionTime":"2025-10-01T12:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.185827 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.185899 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.185924 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.185953 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.185977 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:47Z","lastTransitionTime":"2025-10-01T12:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.289558 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.289625 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.289645 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.289673 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.289694 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:47Z","lastTransitionTime":"2025-10-01T12:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.371545 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.371605 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.371564 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:47 crc kubenswrapper[4727]: E1001 12:37:47.371837 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:37:47 crc kubenswrapper[4727]: E1001 12:37:47.371950 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:47 crc kubenswrapper[4727]: E1001 12:37:47.372042 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.393144 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.393198 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.393214 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.393238 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.393254 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:47Z","lastTransitionTime":"2025-10-01T12:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.451628 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:47 crc kubenswrapper[4727]: E1001 12:37:47.451935 4727 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:37:47 crc kubenswrapper[4727]: E1001 12:37:47.452076 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs podName:f7f4ab8d-5f57-47bd-93fc-9219c596c436 nodeName:}" failed. No retries permitted until 2025-10-01 12:37:55.452052092 +0000 UTC m=+53.773406969 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs") pod "network-metrics-daemon-tvtzh" (UID: "f7f4ab8d-5f57-47bd-93fc-9219c596c436") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.496322 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.496359 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.496367 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.496397 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.496410 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:47Z","lastTransitionTime":"2025-10-01T12:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.598798 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.598848 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.598860 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.598875 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.598886 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:47Z","lastTransitionTime":"2025-10-01T12:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.701816 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.701852 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.701864 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.701884 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.701896 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:47Z","lastTransitionTime":"2025-10-01T12:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.804810 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.804872 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.804887 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.804907 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.804921 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:47Z","lastTransitionTime":"2025-10-01T12:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.907927 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.907969 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.907979 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.907991 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:47 crc kubenswrapper[4727]: I1001 12:37:47.908013 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:47Z","lastTransitionTime":"2025-10-01T12:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.011438 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.011486 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.011497 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.011513 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.011527 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:48Z","lastTransitionTime":"2025-10-01T12:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.114987 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.115037 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.115047 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.115060 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.115069 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:48Z","lastTransitionTime":"2025-10-01T12:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.217907 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.217956 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.217968 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.217988 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.218019 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:48Z","lastTransitionTime":"2025-10-01T12:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.327239 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.327307 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.327320 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.327344 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.327360 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:48Z","lastTransitionTime":"2025-10-01T12:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.371342 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:48 crc kubenswrapper[4727]: E1001 12:37:48.371500 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.429970 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.430026 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.430038 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.430055 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.430067 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:48Z","lastTransitionTime":"2025-10-01T12:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.533288 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.533557 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.533658 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.533779 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.533888 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:48Z","lastTransitionTime":"2025-10-01T12:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.635832 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.635871 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.635881 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.635896 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.635932 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:48Z","lastTransitionTime":"2025-10-01T12:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.738935 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.738967 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.738976 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.738989 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.739020 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:48Z","lastTransitionTime":"2025-10-01T12:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.842335 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.842391 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.842402 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.842419 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.842429 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:48Z","lastTransitionTime":"2025-10-01T12:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.946256 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.946304 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.946327 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.946348 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:48 crc kubenswrapper[4727]: I1001 12:37:48.946362 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:48Z","lastTransitionTime":"2025-10-01T12:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.050202 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.050284 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.050301 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.050330 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.050355 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.154276 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.154332 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.154344 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.154366 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.154380 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.257895 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.257951 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.257961 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.257981 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.258020 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.361278 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.361325 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.361339 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.361361 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.361378 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.371477 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.371508 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.371578 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:49 crc kubenswrapper[4727]: E1001 12:37:49.371724 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:49 crc kubenswrapper[4727]: E1001 12:37:49.371897 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:49 crc kubenswrapper[4727]: E1001 12:37:49.372061 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.465091 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.465608 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.465619 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.465637 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.465665 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.569673 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.569826 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.569849 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.569900 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.569920 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.673677 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.673746 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.673765 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.673797 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.673818 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.778247 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.778310 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.778328 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.778359 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.778383 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.880402 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.880492 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.880513 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.880545 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.880564 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: E1001 12:37:49.903627 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.910066 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.910144 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.910168 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.910196 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.910224 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: E1001 12:37:49.933644 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.939822 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.939904 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.939940 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.939962 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.939978 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: E1001 12:37:49.958675 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.967259 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.967351 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.967370 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.967396 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.967413 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:49 crc kubenswrapper[4727]: E1001 12:37:49.981269 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.986290 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.986323 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.986358 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.986383 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:49 crc kubenswrapper[4727]: I1001 12:37:49.986394 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:49Z","lastTransitionTime":"2025-10-01T12:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:50 crc kubenswrapper[4727]: E1001 12:37:50.002432 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:50 crc kubenswrapper[4727]: E1001 12:37:50.002557 4727 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.005458 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.005509 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.005525 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.005544 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.005558 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:50Z","lastTransitionTime":"2025-10-01T12:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.108255 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.108325 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.108357 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.108387 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.108408 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:50Z","lastTransitionTime":"2025-10-01T12:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.211389 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.211476 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.211499 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.211530 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.211549 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:50Z","lastTransitionTime":"2025-10-01T12:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.314813 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.314890 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.314914 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.314947 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.314967 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:50Z","lastTransitionTime":"2025-10-01T12:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.371572 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:50 crc kubenswrapper[4727]: E1001 12:37:50.371803 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.417756 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.417810 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.417825 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.417849 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.417861 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:50Z","lastTransitionTime":"2025-10-01T12:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.521979 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.522037 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.522048 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.522070 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.522085 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:50Z","lastTransitionTime":"2025-10-01T12:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.626060 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.626138 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.626159 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.626189 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.626210 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:50Z","lastTransitionTime":"2025-10-01T12:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.729780 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.729845 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.729859 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.729880 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.729895 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:50Z","lastTransitionTime":"2025-10-01T12:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.832975 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.833050 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.833062 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.833082 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.833098 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:50Z","lastTransitionTime":"2025-10-01T12:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.936270 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.936361 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.936381 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.936411 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:50 crc kubenswrapper[4727]: I1001 12:37:50.936434 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:50Z","lastTransitionTime":"2025-10-01T12:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.039503 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.039573 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.039588 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.039609 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.039628 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:51Z","lastTransitionTime":"2025-10-01T12:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.143260 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.143339 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.143358 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.143388 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.143409 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:51Z","lastTransitionTime":"2025-10-01T12:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.247467 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.247529 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.247546 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.247573 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.247588 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:51Z","lastTransitionTime":"2025-10-01T12:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.350554 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.350623 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.350642 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.350664 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.350698 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:51Z","lastTransitionTime":"2025-10-01T12:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.368852 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.370338 4727 scope.go:117] "RemoveContainer" containerID="5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.371420 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.371473 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:51 crc kubenswrapper[4727]: E1001 12:37:51.371545 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:51 crc kubenswrapper[4727]: E1001 12:37:51.371654 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.371709 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:51 crc kubenswrapper[4727]: E1001 12:37:51.371846 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.394978 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.419753 4727 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.446331 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.455383 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.455494 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.455516 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.455578 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.455599 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:51Z","lastTransitionTime":"2025-10-01T12:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.471651 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2
494f47613caa63852df78639\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI1001 12:37:37.918925 6163 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI1001 12:37:37.918941 6163 services_controller.go:454] Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF1001 12:37:37.918966 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.490656 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.506202 4727 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.521072 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.537190 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.549392 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.558383 4727 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.558428 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.558444 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.558463 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.558480 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:51Z","lastTransitionTime":"2025-10-01T12:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.574836 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.588941 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.604403 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.620450 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.635464 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 
12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.650611 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.661659 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.661744 4727 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.661764 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.661793 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.661814 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:51Z","lastTransitionTime":"2025-10-01T12:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.674768 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.690290 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.736168 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/1.log" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.739561 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327"} Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.741024 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.764662 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.764698 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.764710 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.764728 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.764739 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:51Z","lastTransitionTime":"2025-10-01T12:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.766741 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.785842 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.804628 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.821555 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.841864 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.867625 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.867680 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.867692 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.867709 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.867722 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:51Z","lastTransitionTime":"2025-10-01T12:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.867851 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.888921 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.914901 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.943053 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.966218 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd8
0e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI1001 12:37:37.918925 6163 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI1001 12:37:37.918941 6163 services_controller.go:454] Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF1001 12:37:37.918966 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.969924 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.969965 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.969977 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.970009 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.970052 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:51Z","lastTransitionTime":"2025-10-01T12:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.980939 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:51 crc kubenswrapper[4727]: I1001 12:37:51.995918 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:51Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.010305 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.025045 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.039699 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.055118 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.073126 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.073193 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.073212 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.073240 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.073270 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:52Z","lastTransitionTime":"2025-10-01T12:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.076424 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.175917 4727 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.175988 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.176034 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.176063 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.176083 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:52Z","lastTransitionTime":"2025-10-01T12:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.278732 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.278776 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.278789 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.278805 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.278821 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:52Z","lastTransitionTime":"2025-10-01T12:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.372330 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:52 crc kubenswrapper[4727]: E1001 12:37:52.372575 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.382086 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.382128 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.382140 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.382159 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.382173 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:52Z","lastTransitionTime":"2025-10-01T12:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.389766 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.407374 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.425858 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.450187 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863
261e1626a11c59be69c9d327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI1001 12:37:37.918925 6163 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI1001 12:37:37.918941 6163 services_controller.go:454] Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF1001 12:37:37.918966 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.473345 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.484468 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.484506 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.484517 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.484536 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.484547 4727 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:52Z","lastTransitionTime":"2025-10-01T12:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.487312 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.502863 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.515457 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.527355 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.541780 4727 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.552973 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.563386 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.579804 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.586414 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.586452 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.586501 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.586517 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.586527 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:52Z","lastTransitionTime":"2025-10-01T12:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.593752 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.607604 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc 
kubenswrapper[4727]: I1001 12:37:52.628452 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.642171 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.689779 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.689826 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.689837 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.689854 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.689865 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:52Z","lastTransitionTime":"2025-10-01T12:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.745784 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/2.log" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.746403 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/1.log" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.749238 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327" exitCode=1 Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.749297 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327"} Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.749351 4727 scope.go:117] "RemoveContainer" containerID="5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.750299 4727 scope.go:117] "RemoveContainer" containerID="6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327" Oct 01 12:37:52 crc kubenswrapper[4727]: E1001 12:37:52.750624 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.767853 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.784753 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.793543 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.793577 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.793589 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.793610 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.793622 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:52Z","lastTransitionTime":"2025-10-01T12:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.801271 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.818028 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.831623 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.846628 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.867388 4727 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.884577 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.896944 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.897034 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.897049 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.897073 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.897089 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:52Z","lastTransitionTime":"2025-10-01T12:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.907615 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.927454 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.945140 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.966213 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:52 crc kubenswrapper[4727]: I1001 12:37:52.988667 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPat
h\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI1001 12:37:37.918925 6163 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI1001 12:37:37.918941 6163 services_controller.go:454] Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 
(template) load balancers\\\\nF1001 12:37:37.918966 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:52Z\\\",\\\"message\\\":\\\"37 6361 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-b9wkt in node crc\\\\nF1001 12:37:52.304941 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z]\\\\nI1001 12:37:52.304727 6361 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{960d98b2-dc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:52.999844 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:52.999902 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:52.999918 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:52.999940 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:52.999955 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:52Z","lastTransitionTime":"2025-10-01T12:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.007399 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.024397 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.042970 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.062298 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.102590 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.102909 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.102988 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.103106 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.103183 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:53Z","lastTransitionTime":"2025-10-01T12:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.208128 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.209011 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.209052 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.209086 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.209101 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:53Z","lastTransitionTime":"2025-10-01T12:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.218898 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.232475 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.244655 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.257537 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.268061 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.281556 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e641
6640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.291834 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.312941 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.313011 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.313029 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.313050 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.313064 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:53Z","lastTransitionTime":"2025-10-01T12:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.314214 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.328236 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.345141 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.355859 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.371940 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:53 crc kubenswrapper[4727]: E1001 12:37:53.372112 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.372220 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:53 crc kubenswrapper[4727]: E1001 12:37:53.372320 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.371959 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:53 crc kubenswrapper[4727]: E1001 12:37:53.372649 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.373725 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5dd79e28fd048f20b42f218511020816b360e3e2494f47613caa63852df78639\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"message\\\":\\\"\\\\\\\"10.217.4.10\\\\\\\", Port:9154, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}, services.LB{Name:\\\\\\\"Service_openshift-dns/dns-default_UDP_node_router+switch_crc\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"UDP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.10\\\\\\\", Port:53, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI1001 12:37:37.918925 6163 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI1001 12:37:37.918941 6163 services_controller.go:454] Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF1001 12:37:37.918966 6163 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:52Z\\\",\\\"message\\\":\\\"37 6361 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-b9wkt in node crc\\\\nF1001 12:37:52.304941 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z]\\\\nI1001 12:37:52.304727 6361 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{960d98b2-dc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.386866 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift
-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 
12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.400143 4727 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.411608 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.416347 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.416719 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.416810 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.416897 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.416974 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:53Z","lastTransitionTime":"2025-10-01T12:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.425047 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.439170 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.454158 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.467024 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.519948 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.520012 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.520026 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.520044 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.520056 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:53Z","lastTransitionTime":"2025-10-01T12:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.622764 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.622838 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.622860 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.622890 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.622913 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:53Z","lastTransitionTime":"2025-10-01T12:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.727269 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.727323 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.727335 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.727356 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.727367 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:53Z","lastTransitionTime":"2025-10-01T12:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.754703 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/2.log" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.759284 4727 scope.go:117] "RemoveContainer" containerID="6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327" Oct 01 12:37:53 crc kubenswrapper[4727]: E1001 12:37:53.759577 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.772898 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.785880 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.798971 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.815485 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e641
6640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.831607 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.832027 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.832085 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.832102 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.832153 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.832179 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:53Z","lastTransitionTime":"2025-10-01T12:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.859723 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.874955 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.894483 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.915061 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.935351 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPat
h\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:52Z\\\",\\\"message\\\":\\\"37 6361 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-b9wkt in node crc\\\\nF1001 12:37:52.304941 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z]\\\\nI1001 12:37:52.304727 6361 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {960d98b2-dc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.935594 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.935629 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.935645 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.935665 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.935677 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:53Z","lastTransitionTime":"2025-10-01T12:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.951691 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.968571 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bad51494-b8d2-4c83-b154-bdcb47072d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f792b3289e210881d451962f8c2fd7f66ba8e01540309210e4286af5c14056c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac7ffe118814edb3f763dce5c8d5adee0faab3a74f38abb06f39d0ffb91dea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40a5eee022677df9faef1fa90bae6dd0987ead513c125425b2aab5c5e635e47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.981540 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:53 crc kubenswrapper[4727]: I1001 12:37:53.994541 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:53Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.010436 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:54Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.030072 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:54Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.038373 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.038450 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.038479 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.038517 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.038542 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:54Z","lastTransitionTime":"2025-10-01T12:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.046121 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:54Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.058528 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:54Z is after 2025-08-24T17:21:41Z" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.141391 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.141492 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.141521 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.141560 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.141595 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:54Z","lastTransitionTime":"2025-10-01T12:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.245333 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.245397 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.245411 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.245436 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.245455 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:54Z","lastTransitionTime":"2025-10-01T12:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.348945 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.349008 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.349023 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.349044 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.349056 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:54Z","lastTransitionTime":"2025-10-01T12:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.371645 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:54 crc kubenswrapper[4727]: E1001 12:37:54.371857 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.453065 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.453118 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.453129 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.453151 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.453165 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:54Z","lastTransitionTime":"2025-10-01T12:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.556512 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.556565 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.556575 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.556593 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.556604 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:54Z","lastTransitionTime":"2025-10-01T12:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.659763 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.659812 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.659824 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.659844 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.659857 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:54Z","lastTransitionTime":"2025-10-01T12:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.762242 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.762302 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.762314 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.762335 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.762347 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:54Z","lastTransitionTime":"2025-10-01T12:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.866356 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.866802 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.866969 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.867241 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.867412 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:54Z","lastTransitionTime":"2025-10-01T12:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.970456 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.970497 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.970507 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.970524 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:54 crc kubenswrapper[4727]: I1001 12:37:54.970537 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:54Z","lastTransitionTime":"2025-10-01T12:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.072970 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.073067 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.073084 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.073110 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.073128 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:55Z","lastTransitionTime":"2025-10-01T12:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.135401 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.135567 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:38:27.135535505 +0000 UTC m=+85.456890372 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.176703 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.176766 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.176781 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.176803 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.176820 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:55Z","lastTransitionTime":"2025-10-01T12:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.236620 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.236694 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.236749 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.236807 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.236972 4727 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.237092 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:27.237066965 +0000 UTC m=+85.558421832 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.237085 4727 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.237117 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.237296 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:27.23721787 +0000 UTC m=+85.558572747 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.237323 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.237351 4727 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.237121 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.237464 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.237493 4727 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.237413 4727 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:27.237396295 +0000 UTC m=+85.558751332 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.237576 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:27.237544459 +0000 UTC m=+85.558899426 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.280292 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.280359 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.280385 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.280420 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.280444 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:55Z","lastTransitionTime":"2025-10-01T12:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.372119 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.372135 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.372119 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.372302 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.372473 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.372619 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.383640 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.383697 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.383715 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.383744 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.383769 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:55Z","lastTransitionTime":"2025-10-01T12:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.487478 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.487547 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.487567 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.487596 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.487618 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:55Z","lastTransitionTime":"2025-10-01T12:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.541125 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.541414 4727 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:37:55 crc kubenswrapper[4727]: E1001 12:37:55.541901 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs podName:f7f4ab8d-5f57-47bd-93fc-9219c596c436 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:11.541477408 +0000 UTC m=+69.862832275 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs") pod "network-metrics-daemon-tvtzh" (UID: "f7f4ab8d-5f57-47bd-93fc-9219c596c436") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.591152 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.591200 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.591210 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.591225 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.591235 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:55Z","lastTransitionTime":"2025-10-01T12:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.694059 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.694118 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.694131 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.694147 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.694158 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:55Z","lastTransitionTime":"2025-10-01T12:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.797434 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.798015 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.798030 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.798053 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.798070 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:55Z","lastTransitionTime":"2025-10-01T12:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.901922 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.901980 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.902025 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.902046 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:55 crc kubenswrapper[4727]: I1001 12:37:55.902061 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:55Z","lastTransitionTime":"2025-10-01T12:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.005186 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.005268 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.005285 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.005311 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.005328 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:56Z","lastTransitionTime":"2025-10-01T12:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.108479 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.108538 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.108552 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.108574 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.108589 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:56Z","lastTransitionTime":"2025-10-01T12:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.211775 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.211819 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.211831 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.211849 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.211861 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:56Z","lastTransitionTime":"2025-10-01T12:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.314932 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.314993 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.315024 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.315044 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.315057 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:56Z","lastTransitionTime":"2025-10-01T12:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.372236 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:56 crc kubenswrapper[4727]: E1001 12:37:56.372403 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.418523 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.418572 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.418584 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.418602 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.418615 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:56Z","lastTransitionTime":"2025-10-01T12:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.521515 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.521591 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.521604 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.521624 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.521636 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:56Z","lastTransitionTime":"2025-10-01T12:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.625096 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.625144 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.625159 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.625182 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.625198 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:56Z","lastTransitionTime":"2025-10-01T12:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.728517 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.728564 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.728575 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.728592 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.728604 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:56Z","lastTransitionTime":"2025-10-01T12:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.832306 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.832367 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.832377 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.832398 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.832412 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:56Z","lastTransitionTime":"2025-10-01T12:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.934912 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.934970 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.934983 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.935027 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:56 crc kubenswrapper[4727]: I1001 12:37:56.935046 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:56Z","lastTransitionTime":"2025-10-01T12:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.037828 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.037868 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.037880 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.037897 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.037908 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:57Z","lastTransitionTime":"2025-10-01T12:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.140706 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.140748 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.140800 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.140822 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.140837 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:57Z","lastTransitionTime":"2025-10-01T12:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.244677 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.244727 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.244737 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.244754 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.244766 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:57Z","lastTransitionTime":"2025-10-01T12:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.347031 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.347640 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.347728 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.347801 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.348091 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:57Z","lastTransitionTime":"2025-10-01T12:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.371740 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.371750 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:57 crc kubenswrapper[4727]: E1001 12:37:57.371924 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:37:57 crc kubenswrapper[4727]: E1001 12:37:57.372029 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.371750 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:57 crc kubenswrapper[4727]: E1001 12:37:57.372114 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.451248 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.451648 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.451798 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.451946 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.452132 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:57Z","lastTransitionTime":"2025-10-01T12:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.555431 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.555918 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.556035 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.556132 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.556198 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:57Z","lastTransitionTime":"2025-10-01T12:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.658780 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.658818 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.658827 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.658846 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.658858 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:57Z","lastTransitionTime":"2025-10-01T12:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.761485 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.761546 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.761563 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.761588 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.761606 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:57Z","lastTransitionTime":"2025-10-01T12:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.864223 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.864301 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.864324 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.864355 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.864378 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:57Z","lastTransitionTime":"2025-10-01T12:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.968526 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.968591 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.968615 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.968644 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:57 crc kubenswrapper[4727]: I1001 12:37:57.968675 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:57Z","lastTransitionTime":"2025-10-01T12:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.071480 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.071542 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.071559 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.071624 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.071644 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:58Z","lastTransitionTime":"2025-10-01T12:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.174903 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.174970 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.174988 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.175047 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.175073 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:58Z","lastTransitionTime":"2025-10-01T12:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.277534 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.277589 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.277601 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.277617 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.277630 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:58Z","lastTransitionTime":"2025-10-01T12:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.372511 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:37:58 crc kubenswrapper[4727]: E1001 12:37:58.372704 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.379422 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.379476 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.379488 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.379508 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.379520 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:58Z","lastTransitionTime":"2025-10-01T12:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.483099 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.483171 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.483183 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.483207 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.483219 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:58Z","lastTransitionTime":"2025-10-01T12:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.585904 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.585982 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.586036 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.586067 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.586086 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:58Z","lastTransitionTime":"2025-10-01T12:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.689252 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.689307 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.689333 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.689357 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.689370 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:58Z","lastTransitionTime":"2025-10-01T12:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.791718 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.791781 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.791796 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.791816 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.791829 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:58Z","lastTransitionTime":"2025-10-01T12:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.895357 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.895419 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.895431 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.895452 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.895477 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:58Z","lastTransitionTime":"2025-10-01T12:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.998755 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.998811 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.998822 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.998843 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:58 crc kubenswrapper[4727]: I1001 12:37:58.998857 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:58Z","lastTransitionTime":"2025-10-01T12:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.101342 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.101404 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.101415 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.101435 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.101449 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:59Z","lastTransitionTime":"2025-10-01T12:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.203976 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.204057 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.204070 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.204093 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.204107 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:59Z","lastTransitionTime":"2025-10-01T12:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.307145 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.307207 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.307219 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.307237 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.307251 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:59Z","lastTransitionTime":"2025-10-01T12:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.372321 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.372449 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:37:59 crc kubenswrapper[4727]: E1001 12:37:59.372508 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.372552 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:37:59 crc kubenswrapper[4727]: E1001 12:37:59.372656 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:37:59 crc kubenswrapper[4727]: E1001 12:37:59.372744 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.409862 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.409932 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.409943 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.409963 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.409976 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:59Z","lastTransitionTime":"2025-10-01T12:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.513594 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.513678 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.513699 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.513725 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.513744 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:59Z","lastTransitionTime":"2025-10-01T12:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.616403 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.616484 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.616497 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.616523 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.616545 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:59Z","lastTransitionTime":"2025-10-01T12:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.720248 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.720299 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.720313 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.720356 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.720371 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:59Z","lastTransitionTime":"2025-10-01T12:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.823284 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.823332 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.823344 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.823362 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.823391 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:59Z","lastTransitionTime":"2025-10-01T12:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.927227 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.927283 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.927297 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.927317 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:37:59 crc kubenswrapper[4727]: I1001 12:37:59.927331 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:37:59Z","lastTransitionTime":"2025-10-01T12:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.031130 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.031182 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.031194 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.031212 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.031225 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.135706 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.135763 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.135779 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.135796 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.135810 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.202421 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.202466 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.202475 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.202491 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.202500 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: E1001 12:38:00.217659 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:00Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.223222 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.223351 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.223374 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.223404 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.223421 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: E1001 12:38:00.242083 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:00Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.250647 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.250700 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.250712 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.250730 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.250744 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: E1001 12:38:00.264553 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:00Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.269777 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.269845 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.269860 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.269881 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.269894 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: E1001 12:38:00.286506 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:00Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.290820 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.290864 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.290875 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.290895 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.290907 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: E1001 12:38:00.312342 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:00Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:00 crc kubenswrapper[4727]: E1001 12:38:00.312489 4727 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.314576 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.314632 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.314644 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.314663 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.314678 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.371501 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:00 crc kubenswrapper[4727]: E1001 12:38:00.371661 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.417417 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.417467 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.417479 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.417496 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.417505 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.520415 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.520495 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.520512 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.520539 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.520555 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.623735 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.623783 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.623796 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.623814 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.623826 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.726585 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.726653 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.726668 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.726692 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.726708 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.829684 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.829748 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.829763 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.829786 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.829807 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.933663 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.933723 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.933735 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.933755 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:00 crc kubenswrapper[4727]: I1001 12:38:00.933768 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:00Z","lastTransitionTime":"2025-10-01T12:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.036940 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.037077 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.037108 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.037145 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.037170 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:01Z","lastTransitionTime":"2025-10-01T12:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.140553 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.140631 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.140660 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.140692 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.140708 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:01Z","lastTransitionTime":"2025-10-01T12:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.244120 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.244169 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.244184 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.244205 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.244221 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:01Z","lastTransitionTime":"2025-10-01T12:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.348020 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.348073 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.348084 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.348105 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.348119 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:01Z","lastTransitionTime":"2025-10-01T12:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.371917 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.372022 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.372029 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:01 crc kubenswrapper[4727]: E1001 12:38:01.372151 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:01 crc kubenswrapper[4727]: E1001 12:38:01.372255 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:01 crc kubenswrapper[4727]: E1001 12:38:01.372320 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.451520 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.451611 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.451682 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.451725 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.451752 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:01Z","lastTransitionTime":"2025-10-01T12:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.555287 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.555371 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.555391 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.555417 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.555434 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:01Z","lastTransitionTime":"2025-10-01T12:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.657965 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.658068 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.658092 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.658121 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.658148 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:01Z","lastTransitionTime":"2025-10-01T12:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.760732 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.760800 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.760818 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.760849 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.760870 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:01Z","lastTransitionTime":"2025-10-01T12:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.863738 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.863788 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.863798 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.863837 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.863848 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:01Z","lastTransitionTime":"2025-10-01T12:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.966291 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.966341 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.966352 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.966375 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:01 crc kubenswrapper[4727]: I1001 12:38:01.966391 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:01Z","lastTransitionTime":"2025-10-01T12:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.069526 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.069592 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.069614 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.069646 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.069666 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:02Z","lastTransitionTime":"2025-10-01T12:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.172915 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.173022 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.173047 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.173072 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.173086 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:02Z","lastTransitionTime":"2025-10-01T12:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.275487 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.275571 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.275586 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.275610 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.275625 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:02Z","lastTransitionTime":"2025-10-01T12:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.371404 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:02 crc kubenswrapper[4727]: E1001 12:38:02.371610 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.377861 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.377921 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.377933 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.377952 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.377966 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:02Z","lastTransitionTime":"2025-10-01T12:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.389311 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771ae
e1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.405167 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.421617 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 
12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.435399 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.473075 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f
5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.480433 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.480494 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.480510 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.480533 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.480553 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:02Z","lastTransitionTime":"2025-10-01T12:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.489694 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bad51494-b8d2-4c83-b154-bdcb47072d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f792b3289e210881d451962f8c2fd7f66ba8e01540309210e4286af5c14056c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac7ffe118814edb3f763dce5c8d5adee0faab3a74f38abb06f39d0ffb91dea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40a5eee022677df9faef1fa90bae6dd0987ead513c125425b2aab5c5e635e47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.508755 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.528592 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.544553 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.565126 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863
261e1626a11c59be69c9d327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:52Z\\\",\\\"message\\\":\\\"37 6361 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-b9wkt in node crc\\\\nF1001 12:37:52.304941 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z]\\\\nI1001 12:37:52.304727 6361 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.581156 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.586338 4727 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.586398 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.586409 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.586426 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.586439 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:02Z","lastTransitionTime":"2025-10-01T12:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.601763 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.617647 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.633769 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.649305 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.663511 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.679966 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\
\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.689069 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.689113 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.689123 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.689139 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.689150 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:02Z","lastTransitionTime":"2025-10-01T12:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.697047 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:02Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.791427 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.791459 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.791483 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.791498 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.791508 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:02Z","lastTransitionTime":"2025-10-01T12:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.894570 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.894610 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.894621 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.894637 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.894648 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:02Z","lastTransitionTime":"2025-10-01T12:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.997943 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.998042 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.998062 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.998088 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:02 crc kubenswrapper[4727]: I1001 12:38:02.998107 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:02Z","lastTransitionTime":"2025-10-01T12:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.100254 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.100320 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.100336 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.100354 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.100364 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:03Z","lastTransitionTime":"2025-10-01T12:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.203218 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.203275 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.203287 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.203303 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.203320 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:03Z","lastTransitionTime":"2025-10-01T12:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.306800 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.307118 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.307276 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.307455 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.307696 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:03Z","lastTransitionTime":"2025-10-01T12:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.371828 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.372381 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:03 crc kubenswrapper[4727]: E1001 12:38:03.372638 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.372670 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:03 crc kubenswrapper[4727]: E1001 12:38:03.373028 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:03 crc kubenswrapper[4727]: E1001 12:38:03.373363 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.411977 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.412081 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.412099 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.412129 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.412148 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:03Z","lastTransitionTime":"2025-10-01T12:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.515477 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.515537 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.515555 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.515583 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.515601 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:03Z","lastTransitionTime":"2025-10-01T12:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.619398 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.619461 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.619482 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.619508 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.619527 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:03Z","lastTransitionTime":"2025-10-01T12:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.722117 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.722165 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.722179 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.722201 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.722217 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:03Z","lastTransitionTime":"2025-10-01T12:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.824671 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.824740 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.824754 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.824777 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.824792 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:03Z","lastTransitionTime":"2025-10-01T12:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.927627 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.927696 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.927706 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.927727 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:03 crc kubenswrapper[4727]: I1001 12:38:03.927746 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:03Z","lastTransitionTime":"2025-10-01T12:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.030909 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.030970 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.030992 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.031054 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.031075 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:04Z","lastTransitionTime":"2025-10-01T12:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.135201 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.135617 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.135762 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.135911 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.136088 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:04Z","lastTransitionTime":"2025-10-01T12:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.239436 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.239864 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.240115 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.240268 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.240438 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:04Z","lastTransitionTime":"2025-10-01T12:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.343098 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.343154 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.343172 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.343195 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.343212 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:04Z","lastTransitionTime":"2025-10-01T12:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.371543 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:04 crc kubenswrapper[4727]: E1001 12:38:04.371701 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.445948 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.446317 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.446455 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.446563 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.446658 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:04Z","lastTransitionTime":"2025-10-01T12:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.549755 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.550523 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.550664 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.550767 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.550863 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:04Z","lastTransitionTime":"2025-10-01T12:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.653831 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.653889 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.653904 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.653925 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.653939 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:04Z","lastTransitionTime":"2025-10-01T12:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.757012 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.757051 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.757063 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.757081 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.757093 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:04Z","lastTransitionTime":"2025-10-01T12:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.859922 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.860271 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.860367 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.860452 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.860537 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:04Z","lastTransitionTime":"2025-10-01T12:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.963404 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.963488 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.963516 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.963551 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:04 crc kubenswrapper[4727]: I1001 12:38:04.963575 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:04Z","lastTransitionTime":"2025-10-01T12:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.066744 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.066806 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.066824 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.066847 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.066888 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:05Z","lastTransitionTime":"2025-10-01T12:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.170497 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.171067 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.171324 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.171539 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.171723 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:05Z","lastTransitionTime":"2025-10-01T12:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.275166 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.275492 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.275583 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.275684 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.275767 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:05Z","lastTransitionTime":"2025-10-01T12:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.371963 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.372066 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:05 crc kubenswrapper[4727]: E1001 12:38:05.372206 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.371963 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:05 crc kubenswrapper[4727]: E1001 12:38:05.372356 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:05 crc kubenswrapper[4727]: E1001 12:38:05.372472 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.380390 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.380446 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.380463 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.380487 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.380506 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:05Z","lastTransitionTime":"2025-10-01T12:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.484392 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.484452 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.484471 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.484494 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.484516 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:05Z","lastTransitionTime":"2025-10-01T12:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.587759 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.587809 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.587822 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.587846 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.587859 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:05Z","lastTransitionTime":"2025-10-01T12:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.690950 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.691011 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.691021 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.691037 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.691050 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:05Z","lastTransitionTime":"2025-10-01T12:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.794635 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.794688 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.794702 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.794718 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.794733 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:05Z","lastTransitionTime":"2025-10-01T12:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.898604 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.899123 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.899239 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.899351 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:05 crc kubenswrapper[4727]: I1001 12:38:05.899452 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:05Z","lastTransitionTime":"2025-10-01T12:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.002104 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.002163 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.002181 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.002206 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.002225 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:06Z","lastTransitionTime":"2025-10-01T12:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.105345 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.105402 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.105412 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.105428 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.105439 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:06Z","lastTransitionTime":"2025-10-01T12:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.208580 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.208652 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.208672 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.208699 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.208721 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:06Z","lastTransitionTime":"2025-10-01T12:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.310889 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.310930 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.310941 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.310960 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.310974 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:06Z","lastTransitionTime":"2025-10-01T12:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.371380 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:06 crc kubenswrapper[4727]: E1001 12:38:06.371566 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.414834 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.415202 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.415290 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.415385 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.415484 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:06Z","lastTransitionTime":"2025-10-01T12:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.518714 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.518779 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.518789 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.518806 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.518817 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:06Z","lastTransitionTime":"2025-10-01T12:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.621459 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.622451 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.622657 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.623094 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.623347 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:06Z","lastTransitionTime":"2025-10-01T12:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.726995 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.727063 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.727079 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.727095 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.727107 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:06Z","lastTransitionTime":"2025-10-01T12:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.829263 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.829607 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.829924 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.830216 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.830436 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:06Z","lastTransitionTime":"2025-10-01T12:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.933085 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.933399 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.933575 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.933740 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:06 crc kubenswrapper[4727]: I1001 12:38:06.933905 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:06Z","lastTransitionTime":"2025-10-01T12:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.037258 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.037605 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.037719 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.037815 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.037945 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:07Z","lastTransitionTime":"2025-10-01T12:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.140293 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.140626 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.140764 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.140860 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.140969 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:07Z","lastTransitionTime":"2025-10-01T12:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.243958 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.244298 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.244364 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.244427 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.244484 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:07Z","lastTransitionTime":"2025-10-01T12:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.351600 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.351952 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.352034 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.352127 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.352200 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:07Z","lastTransitionTime":"2025-10-01T12:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.372047 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.372354 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:07 crc kubenswrapper[4727]: E1001 12:38:07.372359 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.372404 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:07 crc kubenswrapper[4727]: E1001 12:38:07.373049 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:07 crc kubenswrapper[4727]: E1001 12:38:07.373180 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.373759 4727 scope.go:117] "RemoveContainer" containerID="6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327" Oct 01 12:38:07 crc kubenswrapper[4727]: E1001 12:38:07.374253 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.454509 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.454601 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.454612 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.454632 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.454644 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:07Z","lastTransitionTime":"2025-10-01T12:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.557277 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.557565 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.557647 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.557737 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.557816 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:07Z","lastTransitionTime":"2025-10-01T12:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.661018 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.661059 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.661071 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.661086 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.661099 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:07Z","lastTransitionTime":"2025-10-01T12:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.764392 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.764434 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.764446 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.764463 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.764473 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:07Z","lastTransitionTime":"2025-10-01T12:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.867027 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.867059 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.867067 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.867080 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.867089 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:07Z","lastTransitionTime":"2025-10-01T12:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.969206 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.969249 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.969262 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.969278 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:07 crc kubenswrapper[4727]: I1001 12:38:07.969289 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:07Z","lastTransitionTime":"2025-10-01T12:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.072291 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.072373 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.072393 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.072427 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.072545 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:08Z","lastTransitionTime":"2025-10-01T12:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.175275 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.175339 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.175358 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.175381 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.175396 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:08Z","lastTransitionTime":"2025-10-01T12:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.278270 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.278391 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.278411 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.278442 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.278461 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:08Z","lastTransitionTime":"2025-10-01T12:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.372084 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:08 crc kubenswrapper[4727]: E1001 12:38:08.372283 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.380248 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.380441 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.380503 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.380571 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.380635 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:08Z","lastTransitionTime":"2025-10-01T12:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.525091 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.525177 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.525187 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.525204 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.525217 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:08Z","lastTransitionTime":"2025-10-01T12:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.628818 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.628877 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.628889 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.628910 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.628926 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:08Z","lastTransitionTime":"2025-10-01T12:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.732951 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.733420 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.733451 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.733468 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.733480 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:08Z","lastTransitionTime":"2025-10-01T12:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.836734 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.836786 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.836797 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.836815 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.836824 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:08Z","lastTransitionTime":"2025-10-01T12:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.939844 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.939883 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.939892 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.939906 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:08 crc kubenswrapper[4727]: I1001 12:38:08.939915 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:08Z","lastTransitionTime":"2025-10-01T12:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.042298 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.042348 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.042357 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.042370 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.042379 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:09Z","lastTransitionTime":"2025-10-01T12:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.144377 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.144432 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.144444 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.144464 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.144477 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:09Z","lastTransitionTime":"2025-10-01T12:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.247361 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.247452 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.247476 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.247511 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.247541 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:09Z","lastTransitionTime":"2025-10-01T12:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.350309 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.350352 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.350364 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.350381 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.350392 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:09Z","lastTransitionTime":"2025-10-01T12:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.371846 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.371950 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.372031 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:09 crc kubenswrapper[4727]: E1001 12:38:09.372136 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:09 crc kubenswrapper[4727]: E1001 12:38:09.372257 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:09 crc kubenswrapper[4727]: E1001 12:38:09.372447 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.453245 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.453328 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.453351 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.453380 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.453401 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:09Z","lastTransitionTime":"2025-10-01T12:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.556589 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.556653 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.556668 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.556691 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.556710 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:09Z","lastTransitionTime":"2025-10-01T12:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.659085 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.659129 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.659139 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.659154 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.659165 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:09Z","lastTransitionTime":"2025-10-01T12:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.761558 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.761624 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.761646 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.761673 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.761691 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:09Z","lastTransitionTime":"2025-10-01T12:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.863961 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.864022 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.864035 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.864054 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.864064 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:09Z","lastTransitionTime":"2025-10-01T12:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.966978 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.967040 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.967050 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.967066 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:09 crc kubenswrapper[4727]: I1001 12:38:09.967074 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:09Z","lastTransitionTime":"2025-10-01T12:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.069779 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.069830 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.069839 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.069854 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.069862 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.172531 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.172569 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.172579 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.172593 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.172604 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.275553 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.275595 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.275606 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.275623 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.275634 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.371946 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:10 crc kubenswrapper[4727]: E1001 12:38:10.372230 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.377881 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.377920 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.377931 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.377946 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.377962 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.480537 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.480637 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.480656 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.480723 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.480747 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.584251 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.584307 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.584324 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.584350 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.584368 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.690430 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.690779 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.691526 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.691681 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.691835 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.702840 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.702904 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.702918 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.702938 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.702952 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: E1001 12:38:10.721462 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.726490 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.726541 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.726554 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.726574 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.726587 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: E1001 12:38:10.744418 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.749522 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.749593 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.749609 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.749628 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.749662 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: E1001 12:38:10.767592 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.772410 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.772444 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.772484 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.772502 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.772515 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: E1001 12:38:10.792147 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.796674 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.796720 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.796733 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.796747 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.796758 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: E1001 12:38:10.813045 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:10Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:10 crc kubenswrapper[4727]: E1001 12:38:10.813323 4727 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.815219 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.815266 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.815280 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.815297 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.815311 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.918056 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.918110 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.918127 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.918151 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:10 crc kubenswrapper[4727]: I1001 12:38:10.918168 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:10Z","lastTransitionTime":"2025-10-01T12:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.021072 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.021175 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.021190 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.021206 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.021219 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:11Z","lastTransitionTime":"2025-10-01T12:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.124101 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.124159 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.124176 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.124200 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.124217 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:11Z","lastTransitionTime":"2025-10-01T12:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.226085 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.226112 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.226120 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.226135 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.226144 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:11Z","lastTransitionTime":"2025-10-01T12:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.329252 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.329309 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.329320 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.329335 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.329346 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:11Z","lastTransitionTime":"2025-10-01T12:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.371801 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.371873 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:11 crc kubenswrapper[4727]: E1001 12:38:11.371933 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.371823 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:11 crc kubenswrapper[4727]: E1001 12:38:11.372076 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:11 crc kubenswrapper[4727]: E1001 12:38:11.372148 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.431789 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.431859 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.431872 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.431890 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.431923 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:11Z","lastTransitionTime":"2025-10-01T12:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.534380 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.534434 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.534451 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.534472 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.534486 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:11Z","lastTransitionTime":"2025-10-01T12:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.626475 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:11 crc kubenswrapper[4727]: E1001 12:38:11.626616 4727 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:11 crc kubenswrapper[4727]: E1001 12:38:11.626667 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs podName:f7f4ab8d-5f57-47bd-93fc-9219c596c436 nodeName:}" failed. No retries permitted until 2025-10-01 12:38:43.626651919 +0000 UTC m=+101.948006756 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs") pod "network-metrics-daemon-tvtzh" (UID: "f7f4ab8d-5f57-47bd-93fc-9219c596c436") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.637202 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.637280 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.637302 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.637332 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.637352 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:11Z","lastTransitionTime":"2025-10-01T12:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.740298 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.740346 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.740356 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.740374 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.740383 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:11Z","lastTransitionTime":"2025-10-01T12:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.843449 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.843523 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.843545 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.843573 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.843595 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:11Z","lastTransitionTime":"2025-10-01T12:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.946152 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.946214 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.946238 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.946262 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:11 crc kubenswrapper[4727]: I1001 12:38:11.946278 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:11Z","lastTransitionTime":"2025-10-01T12:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.048588 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.048637 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.048649 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.048666 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.048678 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:12Z","lastTransitionTime":"2025-10-01T12:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.151355 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.151398 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.151407 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.151427 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.151438 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:12Z","lastTransitionTime":"2025-10-01T12:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.254632 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.254670 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.254683 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.254699 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.254710 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:12Z","lastTransitionTime":"2025-10-01T12:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.357160 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.357212 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.357231 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.357253 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.357269 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:12Z","lastTransitionTime":"2025-10-01T12:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.371732 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:12 crc kubenswrapper[4727]: E1001 12:38:12.371861 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.394338 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.409824 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.426680 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.449148 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e641
6640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.459825 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.459853 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.459864 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.459879 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.459890 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:12Z","lastTransitionTime":"2025-10-01T12:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.468388 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.495156 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f
5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.512653 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.528049 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.543360 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 
12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.563561 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.563609 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.563630 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.563658 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.563677 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:12Z","lastTransitionTime":"2025-10-01T12:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.563917 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863
261e1626a11c59be69c9d327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:52Z\\\",\\\"message\\\":\\\"37 6361 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-b9wkt in node crc\\\\nF1001 12:37:52.304941 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z]\\\\nI1001 12:37:52.304727 6361 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.584085 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.598204 4727 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bad51494-b8d2-4c83-b154-bdcb47072d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f792b3289e210881d451962f8c2fd7f66ba8e01540309210e4286af5c14056c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac7ffe118814edb3f763dce5c8d5adee0faab3a74f38abb06f39d0ffb91dea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40a5eee022677df9faef1fa90bae6dd0987ead513c125425b2aab5c5e635e47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.612566 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.624571 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.636666 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.651394 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.664395 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.665652 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.665688 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.665701 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.665719 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.665730 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:12Z","lastTransitionTime":"2025-10-01T12:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.676197 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.768546 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.768595 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.768605 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.768620 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.768629 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:12Z","lastTransitionTime":"2025-10-01T12:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.823319 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-slqxs_5cf1a0b8-9119-44c6-91ea-473317335fb9/kube-multus/0.log" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.823358 4727 generic.go:334] "Generic (PLEG): container finished" podID="5cf1a0b8-9119-44c6-91ea-473317335fb9" containerID="4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9" exitCode=1 Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.823384 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-slqxs" event={"ID":"5cf1a0b8-9119-44c6-91ea-473317335fb9","Type":"ContainerDied","Data":"4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9"} Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.823684 4727 scope.go:117] "RemoveContainer" containerID="4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.845606 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.868706 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"message\\\":\\\"2025-10-01T12:37:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c24ab02-f658-4246-b158-adc587a43af1\\\\n2025-10-01T12:37:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c24ab02-f658-4246-b158-adc587a43af1 to /host/opt/cni/bin/\\\\n2025-10-01T12:37:27Z [verbose] multus-daemon started\\\\n2025-10-01T12:37:27Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:38:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.871528 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.871659 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.871755 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.871836 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.871917 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:12Z","lastTransitionTime":"2025-10-01T12:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.888155 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863
261e1626a11c59be69c9d327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:52Z\\\",\\\"message\\\":\\\"37 6361 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-b9wkt in node crc\\\\nF1001 12:37:52.304941 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z]\\\\nI1001 12:37:52.304727 6361 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.906756 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.917951 4727 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bad51494-b8d2-4c83-b154-bdcb47072d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f792b3289e210881d451962f8c2fd7f66ba8e01540309210e4286af5c14056c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac7ffe118814edb3f763dce5c8d5adee0faab3a74f38abb06f39d0ffb91dea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40a5eee022677df9faef1fa90bae6dd0987ead513c125425b2aab5c5e635e47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.928786 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.938691 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.950295 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.962384 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.975603 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.975638 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.975647 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.975662 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.975671 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:12Z","lastTransitionTime":"2025-10-01T12:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.975911 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"starte
dAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 
2025-08-24T17:21:41Z" Oct 01 12:38:12 crc kubenswrapper[4727]: I1001 12:38:12.987854 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.001141 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:12Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.012660 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.025269 4727 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.037344 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 
12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.048449 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.070294 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f
5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.078382 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.078450 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.078469 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.078502 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.078521 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:13Z","lastTransitionTime":"2025-10-01T12:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.087197 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.181514 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.181579 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.181605 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.181644 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.181668 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:13Z","lastTransitionTime":"2025-10-01T12:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.283690 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.283718 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.283725 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.283739 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.283749 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:13Z","lastTransitionTime":"2025-10-01T12:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.372330 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.372369 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.372407 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:13 crc kubenswrapper[4727]: E1001 12:38:13.372539 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:13 crc kubenswrapper[4727]: E1001 12:38:13.372680 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:13 crc kubenswrapper[4727]: E1001 12:38:13.372787 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.387348 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.387399 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.387408 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.387423 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.387432 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:13Z","lastTransitionTime":"2025-10-01T12:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.490427 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.490464 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.490475 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.490490 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.490501 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:13Z","lastTransitionTime":"2025-10-01T12:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.593656 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.594070 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.594180 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.594284 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.594368 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:13Z","lastTransitionTime":"2025-10-01T12:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.697866 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.697915 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.697927 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.697944 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.697957 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:13Z","lastTransitionTime":"2025-10-01T12:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.802450 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.802494 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.802505 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.802522 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.802532 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:13Z","lastTransitionTime":"2025-10-01T12:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.827493 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-slqxs_5cf1a0b8-9119-44c6-91ea-473317335fb9/kube-multus/0.log" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.827562 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-slqxs" event={"ID":"5cf1a0b8-9119-44c6-91ea-473317335fb9","Type":"ContainerStarted","Data":"646bb050f901e31d33162aa5191505e91edf58a243c2dac9bf5b84e99bcebe1c"} Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.843346 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.855156 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bad51494-b8d2-4c83-b154-bdcb47072d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f792b3289e210881d451962f8c2fd7f66ba8e01540309210e4286af5c14056c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac7ffe118814edb3f763dce5c8d5adee0faab3a74f38abb06f39d0ffb91dea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40a5eee022677df9faef1fa90bae6dd0987ead513c125425b2aab5c5e635e47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.871829 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.885788 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.900315 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646bb050f901e31d33162aa5191505e91edf58a243c2dac9bf5b84e99bcebe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"message\\\":\\\"2025-10-01T12:37:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c24ab02-f658-4246-b158-adc587a43af1\\\\n2025-10-01T12:37:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c24ab02-f658-4246-b158-adc587a43af1 to /host/opt/cni/bin/\\\\n2025-10-01T12:37:27Z [verbose] multus-daemon started\\\\n2025-10-01T12:37:27Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:38:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.905300 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.905548 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.905798 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.905952 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.906076 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:13Z","lastTransitionTime":"2025-10-01T12:38:13Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.920660 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:52Z\\\",\\\"message\\\":\\\"37 6361 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-b9wkt in node crc\\\\nF1001 12:37:52.304941 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z]\\\\nI1001 12:37:52.304727 6361 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{960d98b2-dc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.937502 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.952066 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.963158 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.979667 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:13 crc kubenswrapper[4727]: I1001 12:38:13.994148 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:13Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.008860 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.008899 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.008910 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.008926 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.008937 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.011742 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.038261 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 
2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.062325 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/
etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025
-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.079272 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.095257 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.108262 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.110697 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.110813 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.110894 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.110982 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.111080 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.121770 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:14Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.214050 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.214507 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.214638 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.214992 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.215185 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.317833 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.318306 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.318458 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.318619 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.318852 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.372311 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:14 crc kubenswrapper[4727]: E1001 12:38:14.372558 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.422062 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.422118 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.422131 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.422155 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.422168 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.524860 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.524915 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.524930 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.524950 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.524966 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.627330 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.627376 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.627405 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.627425 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.627435 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.730256 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.730329 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.730341 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.730358 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.730370 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.831988 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.832046 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.832062 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.832078 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.832092 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.935083 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.935126 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.935136 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.935156 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:14 crc kubenswrapper[4727]: I1001 12:38:14.935168 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:14Z","lastTransitionTime":"2025-10-01T12:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.038268 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.038325 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.038341 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.038362 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.038375 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.140119 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.140167 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.140179 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.140196 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.140207 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.242541 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.242580 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.242591 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.242606 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.242619 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.344305 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.344350 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.344364 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.344380 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.344391 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.371665 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:15 crc kubenswrapper[4727]: E1001 12:38:15.371758 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.371674 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:15 crc kubenswrapper[4727]: E1001 12:38:15.371819 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.371665 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:15 crc kubenswrapper[4727]: E1001 12:38:15.371860 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.446927 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.446976 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.446992 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.447292 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.447356 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.550528 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.550573 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.550584 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.550600 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.550611 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.653797 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.653848 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.653865 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.653886 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.653902 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.756790 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.756838 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.756851 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.756870 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.756881 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.859229 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.859272 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.859284 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.859300 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.859313 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.961586 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.962200 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.962218 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.962235 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:15 crc kubenswrapper[4727]: I1001 12:38:15.962247 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:15Z","lastTransitionTime":"2025-10-01T12:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.065772 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.065832 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.065843 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.065863 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.065876 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.169310 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.169354 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.169365 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.169385 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.169398 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.273524 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.273607 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.273624 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.273694 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.273714 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.371687 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:16 crc kubenswrapper[4727]: E1001 12:38:16.371983 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.377189 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.377306 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.377327 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.377346 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.377360 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.484927 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.485527 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.485729 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.485765 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.485789 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.589213 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.589272 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.589284 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.589307 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.589322 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.694705 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.694779 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.694791 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.694811 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.694825 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.797503 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.797550 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.797564 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.797583 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.797598 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.901022 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.901072 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.901084 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.901105 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:16 crc kubenswrapper[4727]: I1001 12:38:16.901118 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:16Z","lastTransitionTime":"2025-10-01T12:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.003398 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.003493 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.003517 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.003550 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.003575 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.106566 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.106629 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.106642 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.106665 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.106680 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.209947 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.210322 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.210419 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.210519 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.210617 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.313928 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.314233 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.314348 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.314428 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.314493 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.371798 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.371800 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.372402 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:17 crc kubenswrapper[4727]: E1001 12:38:17.372523 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:17 crc kubenswrapper[4727]: E1001 12:38:17.372789 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:17 crc kubenswrapper[4727]: E1001 12:38:17.372870 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.417924 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.417993 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.418053 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.418079 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.418098 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.521472 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.521529 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.521546 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.521569 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.521587 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.624248 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.624286 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.624297 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.624314 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.624327 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.726795 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.727134 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.727226 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.727318 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.727406 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.831087 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.831188 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.831229 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.831260 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.831278 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.934846 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.934895 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.934914 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.934938 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:17 crc kubenswrapper[4727]: I1001 12:38:17.934956 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:17Z","lastTransitionTime":"2025-10-01T12:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.038611 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.038656 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.038667 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.038683 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.038695 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.141719 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.141807 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.141856 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.141889 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.141949 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.244516 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.244581 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.244594 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.244611 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.244624 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.347805 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.347899 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.347910 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.347923 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.347933 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.371856 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:18 crc kubenswrapper[4727]: E1001 12:38:18.372074 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.373435 4727 scope.go:117] "RemoveContainer" containerID="6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.451506 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.451543 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.451552 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.451567 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.451576 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.554487 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.554530 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.554540 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.554557 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.554570 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.657808 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.657844 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.657857 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.657874 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.657886 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.760820 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.760886 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.760903 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.760928 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.760944 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.846448 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/2.log" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.850065 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57"} Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.863083 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.863132 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.863145 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.863172 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.863184 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.965942 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.966028 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.966037 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.966052 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:18 crc kubenswrapper[4727]: I1001 12:38:18.966061 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:18Z","lastTransitionTime":"2025-10-01T12:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.068752 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.068822 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.068840 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.068865 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.068883 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.171868 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.171939 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.171953 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.171978 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.172015 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.274700 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.274764 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.274782 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.274807 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.274826 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.371940 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.372012 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.372125 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:19 crc kubenswrapper[4727]: E1001 12:38:19.372182 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:19 crc kubenswrapper[4727]: E1001 12:38:19.372342 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:19 crc kubenswrapper[4727]: E1001 12:38:19.372482 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.377677 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.377721 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.377733 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.377751 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.377765 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.480822 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.480887 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.480905 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.480929 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.480947 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.583866 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.583950 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.583967 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.583990 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.584040 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.686491 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.686554 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.686573 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.686598 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.686617 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.789407 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.789465 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.789477 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.789495 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.789529 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.855065 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/3.log" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.855893 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/2.log" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.858750 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57" exitCode=1 Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.858795 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57"} Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.858841 4727 scope.go:117] "RemoveContainer" containerID="6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.859976 4727 scope.go:117] "RemoveContainer" containerID="69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57" Oct 01 12:38:19 crc kubenswrapper[4727]: E1001 12:38:19.860311 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.872242 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.884912 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.892211 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.892289 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.892307 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.892354 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.892374 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.899988 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.921182 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reaso
n\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.941529 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\"
:0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.959708 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.973509 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.986853 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.995608 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.995690 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.995737 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.995758 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:19 crc kubenswrapper[4727]: I1001 12:38:19.995771 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:19Z","lastTransitionTime":"2025-10-01T12:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.001336 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.022074 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.035744 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bad51494-b8d2-4c83-b154-bdcb47072d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f792b3289e210881d451962f8c2fd7f66ba8e01540309210e4286af5c14056c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac7ffe118814edb3f763dce5c8d5adee0faab3a74f38abb06f39d0ffb91dea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40a5eee022677df9faef1fa90bae6dd0987ead513c125425b2aab5c5e635e47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.051067 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.064860 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.078880 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646bb050f901e31d33162aa5191505e91edf58a243c2dac9bf5b84e99bcebe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"message\\\":\\\"2025-10-01T12:37:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c24ab02-f658-4246-b158-adc587a43af1\\\\n2025-10-01T12:37:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c24ab02-f658-4246-b158-adc587a43af1 to /host/opt/cni/bin/\\\\n2025-10-01T12:37:27Z [verbose] multus-daemon started\\\\n2025-10-01T12:37:27Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:38:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.098211 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.098251 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.098262 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.098280 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.098296 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.101161 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6187d80beb2c990275023fbaee35fd393ec4d863261e1626a11c59be69c9d327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:37:52Z\\\",\\\"message\\\":\\\"37 6361 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-b9wkt in node crc\\\\nF1001 12:37:52.304941 6361 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:37:52Z is after 2025-08-24T17:21:41Z]\\\\nI1001 12:37:52.304727 6361 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{960d98b2-dc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ervices.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1001 12:38:19.760838 6713 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1001 12:38:19.760849 6713 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1001 12:38:19.760856 6713 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 12:38:19.760866 6713 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc009129590] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1001 12:38:19.760365 6713 
services_controller.go:454]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.114708 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.129184 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.140345 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.200582 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.200624 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.200636 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.200652 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.200662 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.303982 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.304070 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.304080 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.304100 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.304112 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.372139 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:20 crc kubenswrapper[4727]: E1001 12:38:20.372382 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.406608 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.406651 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.406663 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.406680 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.406692 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.514913 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.514995 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.515025 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.515046 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.515064 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.618231 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.618294 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.618312 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.618336 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.618353 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.722948 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.723038 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.723058 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.723085 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.723126 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.826433 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.826523 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.826585 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.826607 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.826623 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.865768 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/3.log" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.870232 4727 scope.go:117] "RemoveContainer" containerID="69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57" Oct 01 12:38:20 crc kubenswrapper[4727]: E1001 12:38:20.870399 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.890371 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.909038 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.927611 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.930722 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.930774 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.930790 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.930817 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.930836 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:20Z","lastTransitionTime":"2025-10-01T12:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.942384 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\
\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.959641 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.973194 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:20 crc kubenswrapper[4727]: I1001 12:38:20.996839 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:20Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.011644 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.026287 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.033573 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.033619 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.033631 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.033650 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.033663 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.045774 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.060850 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bad51494-b8d2-4c83-b154-bdcb47072d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f792b3289e210881d451962f8c2fd7f66ba8e01540309210e4286af5c14056c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac7ffe118814edb3f763dce5c8d5adee0faab3a74f38abb06f39d0ffb91dea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40a5eee022677df9faef1fa90bae6dd0987ead513c125425b2aab5c5e635e47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.075169 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.089325 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.106767 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646bb050f901e31d33162aa5191505e91edf58a243c2dac9bf5b84e99bcebe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"message\\\":\\\"2025-10-01T12:37:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c24ab02-f658-4246-b158-adc587a43af1\\\\n2025-10-01T12:37:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c24ab02-f658-4246-b158-adc587a43af1 to /host/opt/cni/bin/\\\\n2025-10-01T12:37:27Z [verbose] multus-daemon started\\\\n2025-10-01T12:37:27Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:38:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.136909 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.136977 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.136995 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.137043 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.137058 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.137935 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ervices.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1001 12:38:19.760838 6713 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1001 12:38:19.760849 6713 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1001 12:38:19.760856 6713 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 12:38:19.760866 6713 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc009129590] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1001 12:38:19.760365 6713 
services_controller.go:454]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.138675 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.138745 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.138760 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.138804 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.138818 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.156285 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: E1001 12:38:21.158037 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"0
8ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.162869 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.162941 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.162959 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.162983 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.163030 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.174039 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: E1001 12:38:21.181497 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.186897 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.186977 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.187014 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.187040 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.187059 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.188342 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: E1001 12:38:21.202075 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.206510 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.206695 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.206809 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.206897 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.206970 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: E1001 12:38:21.220067 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.227483 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.227525 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.227534 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.227549 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.227559 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: E1001 12:38:21.238936 4727 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b442e64-06eb-4ef0-99a3-e242f42c1322\\\",\\\"systemUUID\\\":\\\"08ba6cbf-28d5-4f2d-86d9-787fd74364b2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:21Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:21 crc kubenswrapper[4727]: E1001 12:38:21.239097 4727 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.240490 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.240563 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.240587 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.240612 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.240635 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.344163 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.344247 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.344272 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.344302 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.344323 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.369874 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.372298 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.372370 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:21 crc kubenswrapper[4727]: E1001 12:38:21.372517 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:21 crc kubenswrapper[4727]: E1001 12:38:21.372682 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.372936 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:21 crc kubenswrapper[4727]: E1001 12:38:21.373374 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.447421 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.447481 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.447497 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.447520 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.447536 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.550892 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.550955 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.550972 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.551075 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.551122 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.654083 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.654140 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.654158 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.654182 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.654199 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.757489 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.757566 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.757590 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.757621 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.757670 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.860380 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.860426 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.860441 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.860461 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.860475 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.873179 4727 scope.go:117] "RemoveContainer" containerID="69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57" Oct 01 12:38:21 crc kubenswrapper[4727]: E1001 12:38:21.873359 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.963368 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.963437 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.963456 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.963481 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:21 crc kubenswrapper[4727]: I1001 12:38:21.963499 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:21Z","lastTransitionTime":"2025-10-01T12:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.066751 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.066806 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.066820 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.066840 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.066853 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.169433 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.169489 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.169510 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.169532 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.169546 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.272903 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.273540 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.273765 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.273992 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.274169 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.372058 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:22 crc kubenswrapper[4727]: E1001 12:38:22.372351 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.377050 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.377096 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.377113 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.377140 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.377153 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.389135 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18290ae-64a5-44a5-a704-90977d85852b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570820cbb49975c8b566a33c39df7fb5dd01d82c46aeed720c7f74c84ab47ead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-56tnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c7jw9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.408561 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1062bbb4-dd72-4659-91b3-2aa9f1b6a1ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b50096decee04773ae4447bce8059d65900e8d0b71b7bbca98419098bcea04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78bcb09b23d489275dc5b74a46de8da93c8eed7943d8bbb6f5bade1eff979bf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91a871ddf8f735d1974f234e0bda0104c8e6e4f6f83d59d9f96a4d92ebebb85e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8dd20b07642532316b3905b77c880ac6206bfd00f382703b52b7e391de183966\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://446e55d1d0e1727ac6f6b4395b3bb0d3e7f41b5e6416640b941be0da531fce44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:29Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013bedf4b48e605709c01790074624747fdffdb994b1b7ac5671573f8c49f8bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9d6828182ef2af6eb153dde965c8a801fd4b4699acb40576db6c5968a76b63f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nh2x2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nfgjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 
2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.430229 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.446892 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fjlgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"972e1ff9-8a88-471a-b5e6-73f16af6df57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7d52e547226ecda7d54af931ff801a3fc2128ef63c797ad48b31aca7d1359db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rqctk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:24Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fjlgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.474114 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e73ba789d71d08476403f152be94f934c3ee92f2631568dc05cccd69a881693a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.479187 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.479593 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.479809 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.479943 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.480078 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.490437 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bed486c6-587b-40ec-a908-064c3623b893\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f28ab4deda37f2d065260409ffad7f3fd032a10ba6559420d948b94f0549e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1cabea4e68fe88d8cd24753367f3f9d696c0d6f8afd244ae6f4e1d3890d856a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-twp42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gfkfd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.503609 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7f4ab8d-5f57-47bd-93fc-9219c596c436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ljxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-tvtzh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc 
kubenswrapper[4727]: I1001 12:38:22.529275 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a2c2aba-0d9e-458c-9503-41beecb2b37d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f26d8cf74e3e1e650a4f0e14287b9b022195c5abcc9a7271c2b3389aacddf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b0921dcb1bdf9618494aa37873cee8877d5e45c5f782eb7cbd4c8e060551d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3302dcdcaffd3212e090ed59d6b4f88818af212131b353bd36df805f96401083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a81a913d53610e4c9b2f990ba54bda89b863f5bec3913276221d7d423c6d16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3705f78dde89cb2b852e8c0fc1fc8984a33ad68599ff155bade892e2b341ce66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://126170299a629758fe695100b8eecb1a434b52e3444ed07f376b7b2d50771318\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2899b55ee60dd3c8e19cd3339ad14998cc8d81ee92f824ef2bf4b9a62737261f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d09472599fa09d8d136cc1d1fa72d6122859be8871bb34bf07bf813ba6eacd5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.541377 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42f35b03-a7ed-4d43-9541-341a326f3f6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74561a9bcc71769c3ccd6201c598f6da6ded5cf31bb2cb27ea0595b65d43c92e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://04abf069e2eebf352c036924508780a14e287de39c3380dc309b5d5412cae7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62e9fd647f42378a0ea4c00afc357bad93bfc74e2bebc1f152f4943f0fbb7056\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8670c6a092501da58e51cb550754e59486418deafda266336baa16a2d907512\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.554726 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://397042fef7f24ab5dcfff85eba877b52e364e8f1969b433d0be93c17ea3e6541\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f045720190820051c7656e39fb602718c4b5e82d53870fae6f067cb6ab6b885d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.569963 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.582889 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-slqxs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5cf1a0b8-9119-44c6-91ea-473317335fb9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:38:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://646bb050f901e31d33162aa5191505e91edf58a243c2dac9bf5b84e99bcebe1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:12Z\\\",\\\"message\\\":\\\"2025-10-01T12:37:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8c24ab02-f658-4246-b158-adc587a43af1\\\\n2025-10-01T12:37:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8c24ab02-f658-4246-b158-adc587a43af1 to /host/opt/cni/bin/\\\\n2025-10-01T12:37:27Z [verbose] multus-daemon started\\\\n2025-10-01T12:37:27Z [verbose] Readiness Indicator file check\\\\n2025-10-01T12:38:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:38:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dc8jc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-slqxs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.584461 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.584485 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.584495 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.584521 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.584531 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.602031 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a908511b-2ce2-4a11-8dad-3867bee13f57\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T12:38:19Z\\\",\\\"message\\\":\\\"ervices.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1001 12:38:19.760838 6713 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.222\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1001 12:38:19.760849 6713 services_controller.go:444] Built service openshift-authentication/oauth-openshift LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1001 12:38:19.760856 6713 services_controller.go:445] Built service openshift-authentication/oauth-openshift LB template configs for network=default: []services.lbConfig(nil)\\\\nI1001 12:38:19.760866 6713 port_cache.go:96] port-cache(openshift-network-diagnostics_network-check-target-xd92c): added port \\\\u0026{name:openshift-network-diagnostics_network-check-target-xd92c uuid:61897e97-c771-4738-8709-09636387cb00 logicalSwitch:crc ips:[0xc009129590] mac:[10 88 10 217 0 4] expires:{wall:0 ext:0 loc:\\\\u003cnil\\\\u003e}} with IP: [10.217.0.4/23] and MAC: 0a:58:0a:d9:00:04\\\\nI1001 12:38:19.760365 6713 
services_controller.go:454]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:38:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-txq6l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwx55\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.619921 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47c3da6f-7e51-4a6c-b23f-7d7e982b67d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://000c4c1e651175b4f7862e970978277461929d07b3f608fe4c62c6e2944bb2d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4551da61b22157e4fd24b9cb9223c281965ae189908b465070cbab5338966c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b0561dae24f61e321875505d0be6b5bb9175b7147df4b27dbf8c41bf3f5d88f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ca91f291bd273d6c981b4075f6746d7aa11d920a3b763248052e79998e2d742\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bf43b62207f579a0aa81dd8af183e16ab28ee8378e765d9eacb536d385fa62f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 12:37:16.899187 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 12:37:16.903429 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2893009292/tls.crt::/tmp/serving-cert-2893009292/tls.key\\\\\\\"\\\\nI1001 12:37:23.162428 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 12:37:23.167481 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 12:37:23.167509 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 12:37:23.167531 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 12:37:23.167537 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 12:37:23.175532 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 12:37:23.175568 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 12:37:23.175576 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1001 12:37:23.175577 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1001 12:37:23.175581 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 12:37:23.175601 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 12:37:23.175605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 12:37:23.175612 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 12:37:23.178592 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54eca2bc359f29167150391d8a18b18774b3341e94ecda583e370e7fbc35430\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b49d87732beb2784cc09c47d76b0180cf008a0e55afae45049dde0e491b5adbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.635305 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bad51494-b8d2-4c83-b154-bdcb47072d87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f792b3289e210881d451962f8c2fd7f66ba8e01540309210e4286af5c14056c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ac7ffe118814edb3f763dce5c8d5adee0faab3a74f38abb06f39d0ffb91dea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40a5eee022677df9faef1fa90bae6dd0987ead513c125425b2aab5c5e635e47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5801d95e434a56f2ba9e6f26b212681adb8be1b6b4d046992ccb6461edd5434c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T12:37:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T12:37:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T12:37:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.647905 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9wkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10dcb95f-031f-4e4c-bf15-0c8e1b53674a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375e0a4333f2d382d64ddefbba925e39dc9e06873032c7cabaab19da6b028ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ml7nw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}]
,\\\"startTime\\\":\\\"2025-10-01T12:37:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9wkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.663538 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.681840 4727 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T12:37:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://670f40668859f037101d1c1bb7a2a2b76377b0ce4a0446b9faf4786b5eb2e8cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T12:37:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T12:38:22Z is after 2025-08-24T17:21:41Z" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.686626 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.686685 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.686697 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.686713 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.686724 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.789485 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.789924 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.790115 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.790280 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.790414 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.893328 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.893358 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.893368 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.893381 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.893390 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.996794 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.996844 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.996855 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.996872 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:22 crc kubenswrapper[4727]: I1001 12:38:22.996883 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:22Z","lastTransitionTime":"2025-10-01T12:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.099934 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.099991 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.100020 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.100038 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.100050 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.203297 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.203344 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.203368 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.203391 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.203405 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.306151 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.306217 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.306226 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.306242 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.306250 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.371849 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.371868 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.371990 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:23 crc kubenswrapper[4727]: E1001 12:38:23.372182 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:23 crc kubenswrapper[4727]: E1001 12:38:23.372305 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:23 crc kubenswrapper[4727]: E1001 12:38:23.372411 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.409399 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.409443 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.409458 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.409475 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.409488 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.511771 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.511822 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.511834 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.511851 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.511867 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.614979 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.615062 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.615073 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.615088 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.615113 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.718304 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.718416 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.718437 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.718456 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.718471 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.820858 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.820939 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.820962 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.820989 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.821061 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.924809 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.924869 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.924887 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.924910 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:23 crc kubenswrapper[4727]: I1001 12:38:23.924933 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:23Z","lastTransitionTime":"2025-10-01T12:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.028278 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.028866 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.029114 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.029369 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.029562 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.133318 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.133798 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.133959 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.134162 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.134313 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.237567 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.237660 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.237680 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.237706 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.237765 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.340596 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.340654 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.340674 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.340699 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.340714 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.371976 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:24 crc kubenswrapper[4727]: E1001 12:38:24.372207 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.387683 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.443328 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.443368 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.443378 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.443391 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.443401 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.545953 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.546080 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.546109 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.546153 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.546178 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.648933 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.649257 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.649342 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.649432 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.649524 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.752501 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.752952 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.753138 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.753288 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.753451 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.856832 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.856874 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.856885 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.856901 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.856914 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.960239 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.960323 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.960347 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.960377 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:24 crc kubenswrapper[4727]: I1001 12:38:24.960398 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:24Z","lastTransitionTime":"2025-10-01T12:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.062937 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.062983 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.062992 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.063029 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.063038 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.166399 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.166878 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.167095 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.167307 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.167442 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.270191 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.270269 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.270293 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.270321 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.270341 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.371514 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.371601 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.371632 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:25 crc kubenswrapper[4727]: E1001 12:38:25.371738 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:25 crc kubenswrapper[4727]: E1001 12:38:25.371902 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:25 crc kubenswrapper[4727]: E1001 12:38:25.372520 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.372649 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.372742 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.372761 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.372782 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.372828 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.475900 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.475963 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.475981 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.476033 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.476053 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.579283 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.579331 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.579343 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.579360 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.579371 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.682250 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.682316 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.682333 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.682356 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.682372 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.786238 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.786285 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.786336 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.786354 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.786730 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.888796 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.888876 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.888892 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.888916 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.888931 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.991839 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.991891 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.991904 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.991922 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:25 crc kubenswrapper[4727]: I1001 12:38:25.991933 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:25Z","lastTransitionTime":"2025-10-01T12:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.094482 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.094539 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.094550 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.094573 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.094586 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.197632 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.197680 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.197692 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.197710 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.197746 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.300308 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.300365 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.300380 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.300399 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.300413 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.371422 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:26 crc kubenswrapper[4727]: E1001 12:38:26.371560 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.402456 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.402513 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.402532 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.402557 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.402576 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.505100 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.505877 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.505932 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.505956 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.505975 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.608319 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.608382 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.608400 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.608425 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.608443 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.711754 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.711821 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.711839 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.711863 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.711883 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.815137 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.815254 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.815295 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.815327 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.815349 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.918219 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.918281 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.918303 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.918323 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:26 crc kubenswrapper[4727]: I1001 12:38:26.918339 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:26Z","lastTransitionTime":"2025-10-01T12:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.020873 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.020953 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.020970 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.020991 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.021019 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.124311 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.124376 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.124396 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.124424 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.124444 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.201316 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.201571 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.201535012 +0000 UTC m=+149.522889889 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.227883 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.227954 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.227974 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.228047 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.228080 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.302761 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.302845 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.302908 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.302980 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.303042 4727 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.303253 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.303115 4727 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.303193 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.303428 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.303445 4727 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.303306 4727 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.303490 4727 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.303325 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.303286349 +0000 UTC m=+149.624641226 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.303561 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.303534717 +0000 UTC m=+149.624889765 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.303584 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.303575949 +0000 UTC m=+149.624931006 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.303604 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.303596249 +0000 UTC m=+149.624951326 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.330988 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.331082 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.331119 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.331151 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.331174 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.371858 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.371943 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.372028 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.372041 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.372150 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:27 crc kubenswrapper[4727]: E1001 12:38:27.372282 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.434747 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.434799 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.434808 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.434825 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.434838 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.537986 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.538102 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.538128 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.538164 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.538192 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.641239 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.641294 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.641305 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.641323 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.641336 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.745115 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.745168 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.745190 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.745216 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.745236 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.849054 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.849142 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.849166 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.849201 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.849247 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.951858 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.951913 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.951924 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.951939 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:27 crc kubenswrapper[4727]: I1001 12:38:27.951951 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:27Z","lastTransitionTime":"2025-10-01T12:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.055687 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.055757 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.055774 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.055799 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.055814 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.159259 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.159336 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.159356 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.159386 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.159406 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.262901 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.262942 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.262953 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.262967 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.262978 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.366952 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.367073 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.367093 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.367152 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.367172 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.372224 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:28 crc kubenswrapper[4727]: E1001 12:38:28.372467 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.470350 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.470422 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.470437 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.470464 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.470518 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.573347 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.573403 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.573426 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.573458 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.573482 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.675233 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.675285 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.675295 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.675310 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.675320 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.778309 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.778345 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.778501 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.778549 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.778559 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.881761 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.881807 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.881819 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.881836 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.881847 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.985084 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.985127 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.985140 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.985159 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:28 crc kubenswrapper[4727]: I1001 12:38:28.985172 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:28Z","lastTransitionTime":"2025-10-01T12:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.088565 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.088626 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.088636 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.088655 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.088667 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.191160 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.191217 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.191239 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.191257 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.191269 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.294384 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.294533 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.294569 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.294607 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.294630 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.371947 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.372070 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.371960 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:29 crc kubenswrapper[4727]: E1001 12:38:29.372183 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:29 crc kubenswrapper[4727]: E1001 12:38:29.372326 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:29 crc kubenswrapper[4727]: E1001 12:38:29.372447 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.398284 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.398343 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.398366 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.398401 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.398427 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.501298 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.501384 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.501400 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.501418 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.501431 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.604692 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.604746 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.604764 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.604788 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.604807 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.708568 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.708688 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.708772 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.708802 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.708819 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.812028 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.812104 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.812126 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.812201 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.812223 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.915092 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.915141 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.915152 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.915171 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:29 crc kubenswrapper[4727]: I1001 12:38:29.915188 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:29Z","lastTransitionTime":"2025-10-01T12:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.018743 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.018816 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.018834 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.018855 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.018872 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.122087 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.122144 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.122171 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.122200 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.122223 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.225940 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.225985 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.226018 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.226043 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.226069 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.328791 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.328829 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.328839 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.328853 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.328862 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.372096 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:30 crc kubenswrapper[4727]: E1001 12:38:30.372301 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.432484 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.432571 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.432593 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.432619 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.432640 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.536098 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.536176 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.536196 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.536224 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.536242 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.638985 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.639037 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.639050 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.639065 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.639076 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.746766 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.746831 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.746853 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.746884 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.746905 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.849249 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.849298 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.849313 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.849329 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.849341 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.952098 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.952342 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.952359 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.952381 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:30 crc kubenswrapper[4727]: I1001 12:38:30.952400 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:30Z","lastTransitionTime":"2025-10-01T12:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.055807 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.055861 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.055877 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.055897 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.055910 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.158897 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.158950 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.158959 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.158977 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.158990 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.261830 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.261916 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.261941 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.261972 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.262030 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.365866 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.366043 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.366065 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.366088 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.366106 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.372230 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:31 crc kubenswrapper[4727]: E1001 12:38:31.372386 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.372486 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.372554 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:31 crc kubenswrapper[4727]: E1001 12:38:31.372807 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:31 crc kubenswrapper[4727]: E1001 12:38:31.372955 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.469988 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.470090 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.470108 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.470133 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.470153 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.491234 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.491305 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.491328 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.491356 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.491379 4727 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T12:38:31Z","lastTransitionTime":"2025-10-01T12:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.578241 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw"] Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.578591 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.582121 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.582600 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.584441 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.584637 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.621078 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=67.621054176 podStartE2EDuration="1m7.621054176s" podCreationTimestamp="2025-10-01 12:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:38:31.619926991 +0000 UTC m=+89.941281888" watchObservedRunningTime="2025-10-01 12:38:31.621054176 +0000 UTC m=+89.942409053" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.643162 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.643131815 podStartE2EDuration="1m8.643131815s" podCreationTimestamp="2025-10-01 12:37:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:38:31.640501624 +0000 UTC m=+89.961856461" watchObservedRunningTime="2025-10-01 12:38:31.643131815 +0000 UTC m=+89.964486692" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.654904 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e291eede-7301-4701-961d-c5f7719d8698-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.654942 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e291eede-7301-4701-961d-c5f7719d8698-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.654957 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e291eede-7301-4701-961d-c5f7719d8698-service-ca\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" 
(UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.654982 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e291eede-7301-4701-961d-c5f7719d8698-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.655018 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e291eede-7301-4701-961d-c5f7719d8698-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.680446 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.680429771 podStartE2EDuration="7.680429771s" podCreationTimestamp="2025-10-01 12:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:38:31.65568345 +0000 UTC m=+89.977038287" watchObservedRunningTime="2025-10-01 12:38:31.680429771 +0000 UTC m=+90.001784618" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.713141 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gfkfd" podStartSLOduration=66.713119006 podStartE2EDuration="1m6.713119006s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:38:31.698960641 +0000 UTC m=+90.020315508" watchObservedRunningTime="2025-10-01 12:38:31.713119006 +0000 UTC m=+90.034473883" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.734414 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.73438862 podStartE2EDuration="1m8.73438862s" podCreationTimestamp="2025-10-01 12:37:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:38:31.734035178 +0000 UTC m=+90.055390015" watchObservedRunningTime="2025-10-01 12:38:31.73438862 +0000 UTC m=+90.055743497" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.750542 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.750522706 podStartE2EDuration="38.750522706s" podCreationTimestamp="2025-10-01 12:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:38:31.7500391 +0000 UTC m=+90.071393997" watchObservedRunningTime="2025-10-01 12:38:31.750522706 +0000 UTC m=+90.071877553" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.756665 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/e291eede-7301-4701-961d-c5f7719d8698-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.756697 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e291eede-7301-4701-961d-c5f7719d8698-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.756730 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e291eede-7301-4701-961d-c5f7719d8698-service-ca\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.756764 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e291eede-7301-4701-961d-c5f7719d8698-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.756806 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e291eede-7301-4701-961d-c5f7719d8698-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.756838 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e291eede-7301-4701-961d-c5f7719d8698-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.757464 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e291eede-7301-4701-961d-c5f7719d8698-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.757526 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e291eede-7301-4701-961d-c5f7719d8698-service-ca\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.772680 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e291eede-7301-4701-961d-c5f7719d8698-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: 
\"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.781575 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e291eede-7301-4701-961d-c5f7719d8698-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-st7sw\" (UID: \"e291eede-7301-4701-961d-c5f7719d8698\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.812145 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-slqxs" podStartSLOduration=66.812120369 podStartE2EDuration="1m6.812120369s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:38:31.811412747 +0000 UTC m=+90.132767584" watchObservedRunningTime="2025-10-01 12:38:31.812120369 +0000 UTC m=+90.133475206" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.895160 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-b9wkt" podStartSLOduration=66.895141651 podStartE2EDuration="1m6.895141651s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:38:31.895082059 +0000 UTC m=+90.216436916" watchObservedRunningTime="2025-10-01 12:38:31.895141651 +0000 UTC m=+90.216496508" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.897138 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.941043 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fjlgl" podStartSLOduration=68.941029322 podStartE2EDuration="1m8.941029322s" podCreationTimestamp="2025-10-01 12:37:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:38:31.940798725 +0000 UTC m=+90.262153572" watchObservedRunningTime="2025-10-01 12:38:31.941029322 +0000 UTC m=+90.262384149" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.951326 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podStartSLOduration=66.951301058 podStartE2EDuration="1m6.951301058s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:38:31.951164733 +0000 UTC m=+90.272519580" watchObservedRunningTime="2025-10-01 12:38:31.951301058 +0000 UTC m=+90.272655905" Oct 01 12:38:31 crc kubenswrapper[4727]: I1001 12:38:31.970367 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nfgjl" podStartSLOduration=66.970350943 podStartE2EDuration="1m6.970350943s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:38:31.970307672 +0000 UTC m=+90.291662529" watchObservedRunningTime="2025-10-01 12:38:31.970350943 +0000 UTC m=+90.291705780" Oct 01 12:38:32 crc kubenswrapper[4727]: I1001 12:38:32.372493 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:32 crc kubenswrapper[4727]: E1001 12:38:32.374253 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:32 crc kubenswrapper[4727]: I1001 12:38:32.906163 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" event={"ID":"e291eede-7301-4701-961d-c5f7719d8698","Type":"ContainerStarted","Data":"5e05de17294d2292f5445030ee163868f3af6e081fd271a197db9950747e323d"} Oct 01 12:38:32 crc kubenswrapper[4727]: I1001 12:38:32.906233 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" event={"ID":"e291eede-7301-4701-961d-c5f7719d8698","Type":"ContainerStarted","Data":"00a465f553fcbb3433c8ee223da3f39eb72d2a814a223f206b5e573292c5e5bc"} Oct 01 12:38:32 crc kubenswrapper[4727]: I1001 12:38:32.929024 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-st7sw" podStartSLOduration=68.928975541 podStartE2EDuration="1m8.928975541s" podCreationTimestamp="2025-10-01 12:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:38:32.928556738 +0000 UTC m=+91.249911615" watchObservedRunningTime="2025-10-01 12:38:32.928975541 +0000 UTC m=+91.250330388" Oct 01 12:38:33 crc kubenswrapper[4727]: I1001 12:38:33.372274 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:33 crc kubenswrapper[4727]: I1001 12:38:33.372360 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:33 crc kubenswrapper[4727]: I1001 12:38:33.372486 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:33 crc kubenswrapper[4727]: E1001 12:38:33.372628 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:33 crc kubenswrapper[4727]: E1001 12:38:33.372778 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:33 crc kubenswrapper[4727]: E1001 12:38:33.373303 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:34 crc kubenswrapper[4727]: I1001 12:38:34.372472 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:34 crc kubenswrapper[4727]: E1001 12:38:34.372665 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:35 crc kubenswrapper[4727]: I1001 12:38:35.372189 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:35 crc kubenswrapper[4727]: I1001 12:38:35.372241 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:35 crc kubenswrapper[4727]: I1001 12:38:35.372290 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:35 crc kubenswrapper[4727]: E1001 12:38:35.373051 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:35 crc kubenswrapper[4727]: I1001 12:38:35.373473 4727 scope.go:117] "RemoveContainer" containerID="69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57" Oct 01 12:38:35 crc kubenswrapper[4727]: E1001 12:38:35.373751 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" Oct 01 12:38:35 crc kubenswrapper[4727]: E1001 12:38:35.374185 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:35 crc kubenswrapper[4727]: E1001 12:38:35.374277 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:36 crc kubenswrapper[4727]: I1001 12:38:36.371681 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:36 crc kubenswrapper[4727]: E1001 12:38:36.371866 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:37 crc kubenswrapper[4727]: I1001 12:38:37.372124 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:37 crc kubenswrapper[4727]: I1001 12:38:37.372241 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:37 crc kubenswrapper[4727]: E1001 12:38:37.372317 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:37 crc kubenswrapper[4727]: I1001 12:38:37.372340 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:37 crc kubenswrapper[4727]: E1001 12:38:37.372501 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:37 crc kubenswrapper[4727]: E1001 12:38:37.372733 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:38 crc kubenswrapper[4727]: I1001 12:38:38.372484 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:38 crc kubenswrapper[4727]: E1001 12:38:38.372725 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:39 crc kubenswrapper[4727]: I1001 12:38:39.372210 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:39 crc kubenswrapper[4727]: I1001 12:38:39.372317 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:39 crc kubenswrapper[4727]: I1001 12:38:39.372403 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:39 crc kubenswrapper[4727]: E1001 12:38:39.373783 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:39 crc kubenswrapper[4727]: E1001 12:38:39.373991 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:39 crc kubenswrapper[4727]: E1001 12:38:39.373670 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:40 crc kubenswrapper[4727]: I1001 12:38:40.373390 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:40 crc kubenswrapper[4727]: E1001 12:38:40.373604 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:41 crc kubenswrapper[4727]: I1001 12:38:41.371588 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:41 crc kubenswrapper[4727]: E1001 12:38:41.372159 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:41 crc kubenswrapper[4727]: I1001 12:38:41.371710 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:41 crc kubenswrapper[4727]: I1001 12:38:41.371599 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:41 crc kubenswrapper[4727]: E1001 12:38:41.372324 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:41 crc kubenswrapper[4727]: E1001 12:38:41.372419 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:42 crc kubenswrapper[4727]: I1001 12:38:42.372284 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:42 crc kubenswrapper[4727]: E1001 12:38:42.375279 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:43 crc kubenswrapper[4727]: I1001 12:38:43.372402 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:43 crc kubenswrapper[4727]: I1001 12:38:43.372524 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:43 crc kubenswrapper[4727]: I1001 12:38:43.372582 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:43 crc kubenswrapper[4727]: E1001 12:38:43.373599 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:43 crc kubenswrapper[4727]: E1001 12:38:43.373725 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:43 crc kubenswrapper[4727]: E1001 12:38:43.373894 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:43 crc kubenswrapper[4727]: I1001 12:38:43.695954 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:43 crc kubenswrapper[4727]: E1001 12:38:43.696203 4727 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:43 crc kubenswrapper[4727]: E1001 12:38:43.696320 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs podName:f7f4ab8d-5f57-47bd-93fc-9219c596c436 nodeName:}" failed. No retries permitted until 2025-10-01 12:39:47.696283212 +0000 UTC m=+166.017638089 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs") pod "network-metrics-daemon-tvtzh" (UID: "f7f4ab8d-5f57-47bd-93fc-9219c596c436") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 12:38:44 crc kubenswrapper[4727]: I1001 12:38:44.372300 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:44 crc kubenswrapper[4727]: E1001 12:38:44.372467 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:45 crc kubenswrapper[4727]: I1001 12:38:45.371374 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:45 crc kubenswrapper[4727]: I1001 12:38:45.371515 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:45 crc kubenswrapper[4727]: E1001 12:38:45.371566 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:45 crc kubenswrapper[4727]: I1001 12:38:45.371654 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:45 crc kubenswrapper[4727]: E1001 12:38:45.371825 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:45 crc kubenswrapper[4727]: E1001 12:38:45.372207 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:46 crc kubenswrapper[4727]: I1001 12:38:46.371608 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:46 crc kubenswrapper[4727]: E1001 12:38:46.371860 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:47 crc kubenswrapper[4727]: I1001 12:38:47.372454 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:47 crc kubenswrapper[4727]: I1001 12:38:47.372594 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:47 crc kubenswrapper[4727]: I1001 12:38:47.372438 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:47 crc kubenswrapper[4727]: E1001 12:38:47.372836 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:47 crc kubenswrapper[4727]: E1001 12:38:47.372884 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:47 crc kubenswrapper[4727]: E1001 12:38:47.372632 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:48 crc kubenswrapper[4727]: I1001 12:38:48.372355 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:48 crc kubenswrapper[4727]: E1001 12:38:48.372518 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:49 crc kubenswrapper[4727]: I1001 12:38:49.371651 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:49 crc kubenswrapper[4727]: I1001 12:38:49.372184 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:49 crc kubenswrapper[4727]: E1001 12:38:49.372487 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:49 crc kubenswrapper[4727]: I1001 12:38:49.372572 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:49 crc kubenswrapper[4727]: E1001 12:38:49.372671 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:49 crc kubenswrapper[4727]: E1001 12:38:49.372910 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:50 crc kubenswrapper[4727]: I1001 12:38:50.372288 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:50 crc kubenswrapper[4727]: I1001 12:38:50.373172 4727 scope.go:117] "RemoveContainer" containerID="69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57" Oct 01 12:38:50 crc kubenswrapper[4727]: E1001 12:38:50.373236 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:50 crc kubenswrapper[4727]: E1001 12:38:50.373386 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pwx55_openshift-ovn-kubernetes(a908511b-2ce2-4a11-8dad-3867bee13f57)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" Oct 01 12:38:51 crc kubenswrapper[4727]: I1001 12:38:51.371623 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:51 crc kubenswrapper[4727]: E1001 12:38:51.371842 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:51 crc kubenswrapper[4727]: I1001 12:38:51.371933 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:51 crc kubenswrapper[4727]: E1001 12:38:51.372117 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:51 crc kubenswrapper[4727]: I1001 12:38:51.372121 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:51 crc kubenswrapper[4727]: E1001 12:38:51.372679 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:52 crc kubenswrapper[4727]: I1001 12:38:52.371914 4727 util.go:30] "No sandbox for pod can be found. 
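The "back-off 40s restarting failed container=ovnkube-controller ... CrashLoopBackOff" entry above, and the "back-off 10s" entry for kube-multus further down, are points on the kubelet's crash-loop backoff curve, which by default doubles from a 10s base up to a 5m cap. A minimal sketch of that doubling, stated as an assumption about the defaults rather than a copy of the kubelet's implementation:

package main

import (
	"fmt"
	"time"
)

// crashLoopDelay doubles the restart delay per failed restart, capped at max.
func crashLoopDelay(restarts int, base, max time.Duration) time.Duration {
	d := base
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= max {
			return max
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> back-off %s\n", r, crashLoopDelay(r, 10*time.Second, 5*time.Minute))
	}
	// Prints 10s, 20s, 40s, 1m20s, 2m40s, 5m0s, 5m0s: the 10s and 40s values
	// seen for kube-multus and ovnkube-controller sit on this curve.
}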
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:52 crc kubenswrapper[4727]: E1001 12:38:52.374040 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:53 crc kubenswrapper[4727]: I1001 12:38:53.372098 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:53 crc kubenswrapper[4727]: I1001 12:38:53.372198 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:53 crc kubenswrapper[4727]: I1001 12:38:53.372227 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:53 crc kubenswrapper[4727]: E1001 12:38:53.372272 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:53 crc kubenswrapper[4727]: E1001 12:38:53.372354 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:53 crc kubenswrapper[4727]: E1001 12:38:53.372437 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:54 crc kubenswrapper[4727]: I1001 12:38:54.371810 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:54 crc kubenswrapper[4727]: E1001 12:38:54.373423 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:55 crc kubenswrapper[4727]: I1001 12:38:55.371648 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:55 crc kubenswrapper[4727]: I1001 12:38:55.371699 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:55 crc kubenswrapper[4727]: I1001 12:38:55.371783 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:55 crc kubenswrapper[4727]: E1001 12:38:55.372370 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:55 crc kubenswrapper[4727]: E1001 12:38:55.372613 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:55 crc kubenswrapper[4727]: E1001 12:38:55.372823 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:56 crc kubenswrapper[4727]: I1001 12:38:56.372334 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:56 crc kubenswrapper[4727]: E1001 12:38:56.372558 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:57 crc kubenswrapper[4727]: I1001 12:38:57.372279 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:57 crc kubenswrapper[4727]: I1001 12:38:57.372370 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:57 crc kubenswrapper[4727]: I1001 12:38:57.372433 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:57 crc kubenswrapper[4727]: E1001 12:38:57.372544 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:38:57 crc kubenswrapper[4727]: E1001 12:38:57.372683 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:57 crc kubenswrapper[4727]: E1001 12:38:57.372802 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:58 crc kubenswrapper[4727]: I1001 12:38:58.371966 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:38:58 crc kubenswrapper[4727]: E1001 12:38:58.372174 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:38:59 crc kubenswrapper[4727]: I1001 12:38:59.004566 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-slqxs_5cf1a0b8-9119-44c6-91ea-473317335fb9/kube-multus/1.log" Oct 01 12:38:59 crc kubenswrapper[4727]: I1001 12:38:59.005407 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-slqxs_5cf1a0b8-9119-44c6-91ea-473317335fb9/kube-multus/0.log" Oct 01 12:38:59 crc kubenswrapper[4727]: I1001 12:38:59.005492 4727 generic.go:334] "Generic (PLEG): container finished" podID="5cf1a0b8-9119-44c6-91ea-473317335fb9" containerID="646bb050f901e31d33162aa5191505e91edf58a243c2dac9bf5b84e99bcebe1c" exitCode=1 Oct 01 12:38:59 crc kubenswrapper[4727]: I1001 12:38:59.005556 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-slqxs" event={"ID":"5cf1a0b8-9119-44c6-91ea-473317335fb9","Type":"ContainerDied","Data":"646bb050f901e31d33162aa5191505e91edf58a243c2dac9bf5b84e99bcebe1c"} Oct 01 12:38:59 crc kubenswrapper[4727]: I1001 12:38:59.005606 4727 scope.go:117] "RemoveContainer" containerID="4d03a6f83a93639d9c14c4f26dbb7dbad6eca2c8026dee6d8b460285623917d9" Oct 01 12:38:59 crc kubenswrapper[4727]: I1001 12:38:59.006351 4727 scope.go:117] "RemoveContainer" containerID="646bb050f901e31d33162aa5191505e91edf58a243c2dac9bf5b84e99bcebe1c" Oct 01 12:38:59 crc kubenswrapper[4727]: E1001 12:38:59.006619 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-slqxs_openshift-multus(5cf1a0b8-9119-44c6-91ea-473317335fb9)\"" pod="openshift-multus/multus-slqxs" podUID="5cf1a0b8-9119-44c6-91ea-473317335fb9" Oct 01 12:38:59 crc 
kubenswrapper[4727]: I1001 12:38:59.372255 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:38:59 crc kubenswrapper[4727]: I1001 12:38:59.372329 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:38:59 crc kubenswrapper[4727]: E1001 12:38:59.372486 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:38:59 crc kubenswrapper[4727]: E1001 12:38:59.372650 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:38:59 crc kubenswrapper[4727]: I1001 12:38:59.373078 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:38:59 crc kubenswrapper[4727]: E1001 12:38:59.373449 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:00 crc kubenswrapper[4727]: I1001 12:39:00.011179 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-slqxs_5cf1a0b8-9119-44c6-91ea-473317335fb9/kube-multus/1.log" Oct 01 12:39:00 crc kubenswrapper[4727]: I1001 12:39:00.371452 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:00 crc kubenswrapper[4727]: E1001 12:39:00.371643 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:01 crc kubenswrapper[4727]: I1001 12:39:01.371363 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:01 crc kubenswrapper[4727]: I1001 12:39:01.371418 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:01 crc kubenswrapper[4727]: I1001 12:39:01.371545 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:01 crc kubenswrapper[4727]: E1001 12:39:01.371727 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:39:01 crc kubenswrapper[4727]: E1001 12:39:01.371853 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:01 crc kubenswrapper[4727]: E1001 12:39:01.371956 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:02 crc kubenswrapper[4727]: I1001 12:39:02.371361 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:02 crc kubenswrapper[4727]: E1001 12:39:02.373423 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:02 crc kubenswrapper[4727]: E1001 12:39:02.389337 4727 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 01 12:39:02 crc kubenswrapper[4727]: E1001 12:39:02.475743 4727 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 12:39:03 crc kubenswrapper[4727]: I1001 12:39:03.371833 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:03 crc kubenswrapper[4727]: I1001 12:39:03.371844 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:03 crc kubenswrapper[4727]: I1001 12:39:03.371871 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:03 crc kubenswrapper[4727]: E1001 12:39:03.372153 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:03 crc kubenswrapper[4727]: E1001 12:39:03.372277 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:03 crc kubenswrapper[4727]: E1001 12:39:03.372423 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:39:04 crc kubenswrapper[4727]: I1001 12:39:04.372430 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:04 crc kubenswrapper[4727]: E1001 12:39:04.372619 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:04 crc kubenswrapper[4727]: I1001 12:39:04.374052 4727 scope.go:117] "RemoveContainer" containerID="69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57" Oct 01 12:39:05 crc kubenswrapper[4727]: I1001 12:39:05.031586 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/3.log" Oct 01 12:39:05 crc kubenswrapper[4727]: I1001 12:39:05.035647 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerStarted","Data":"b2e1765b2828434b1e02dfd7ac7d9dc1358e15d8a1f0f3caba9d3b234e1cd232"} Oct 01 12:39:05 crc kubenswrapper[4727]: I1001 12:39:05.036233 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:39:05 crc kubenswrapper[4727]: I1001 12:39:05.070053 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podStartSLOduration=100.070036552 podStartE2EDuration="1m40.070036552s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:05.069823065 +0000 UTC m=+123.391177912" watchObservedRunningTime="2025-10-01 12:39:05.070036552 +0000 UTC m=+123.391391389" Oct 01 12:39:05 crc kubenswrapper[4727]: I1001 12:39:05.297532 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tvtzh"] Oct 01 12:39:05 crc kubenswrapper[4727]: I1001 12:39:05.297747 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:05 crc kubenswrapper[4727]: E1001 12:39:05.297871 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:39:05 crc kubenswrapper[4727]: I1001 12:39:05.371667 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:05 crc kubenswrapper[4727]: E1001 12:39:05.371834 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:05 crc kubenswrapper[4727]: I1001 12:39:05.372138 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:05 crc kubenswrapper[4727]: E1001 12:39:05.372189 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:06 crc kubenswrapper[4727]: I1001 12:39:06.371776 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:06 crc kubenswrapper[4727]: I1001 12:39:06.371845 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:06 crc kubenswrapper[4727]: E1001 12:39:06.372077 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:39:06 crc kubenswrapper[4727]: E1001 12:39:06.372552 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:07 crc kubenswrapper[4727]: I1001 12:39:07.371445 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:07 crc kubenswrapper[4727]: I1001 12:39:07.371522 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:07 crc kubenswrapper[4727]: E1001 12:39:07.371813 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:07 crc kubenswrapper[4727]: E1001 12:39:07.371941 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:07 crc kubenswrapper[4727]: E1001 12:39:07.477303 4727 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 12:39:08 crc kubenswrapper[4727]: I1001 12:39:08.372392 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:08 crc kubenswrapper[4727]: I1001 12:39:08.372458 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:08 crc kubenswrapper[4727]: E1001 12:39:08.372664 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:39:08 crc kubenswrapper[4727]: E1001 12:39:08.372806 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:09 crc kubenswrapper[4727]: I1001 12:39:09.372081 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:09 crc kubenswrapper[4727]: I1001 12:39:09.372113 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:09 crc kubenswrapper[4727]: E1001 12:39:09.372368 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:09 crc kubenswrapper[4727]: E1001 12:39:09.372494 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:10 crc kubenswrapper[4727]: I1001 12:39:10.371562 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:10 crc kubenswrapper[4727]: E1001 12:39:10.371751 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:39:10 crc kubenswrapper[4727]: I1001 12:39:10.371562 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:10 crc kubenswrapper[4727]: E1001 12:39:10.372223 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:11 crc kubenswrapper[4727]: I1001 12:39:11.372350 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:11 crc kubenswrapper[4727]: I1001 12:39:11.372353 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:11 crc kubenswrapper[4727]: E1001 12:39:11.372540 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:11 crc kubenswrapper[4727]: E1001 12:39:11.372991 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:12 crc kubenswrapper[4727]: I1001 12:39:12.371493 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:12 crc kubenswrapper[4727]: I1001 12:39:12.374047 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:12 crc kubenswrapper[4727]: E1001 12:39:12.374571 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:12 crc kubenswrapper[4727]: E1001 12:39:12.374038 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:39:12 crc kubenswrapper[4727]: E1001 12:39:12.478161 4727 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 12:39:13 crc kubenswrapper[4727]: I1001 12:39:13.371338 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:13 crc kubenswrapper[4727]: I1001 12:39:13.372344 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:13 crc kubenswrapper[4727]: E1001 12:39:13.372435 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:13 crc kubenswrapper[4727]: E1001 12:39:13.372477 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:13 crc kubenswrapper[4727]: I1001 12:39:13.372822 4727 scope.go:117] "RemoveContainer" containerID="646bb050f901e31d33162aa5191505e91edf58a243c2dac9bf5b84e99bcebe1c" Oct 01 12:39:14 crc kubenswrapper[4727]: I1001 12:39:14.071468 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-slqxs_5cf1a0b8-9119-44c6-91ea-473317335fb9/kube-multus/1.log" Oct 01 12:39:14 crc kubenswrapper[4727]: I1001 12:39:14.071842 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-slqxs" event={"ID":"5cf1a0b8-9119-44c6-91ea-473317335fb9","Type":"ContainerStarted","Data":"6e30e4ca49faf2e9ac3302ec2021e148f3abddfba6a1d82a337dee7352158388"} Oct 01 12:39:14 crc kubenswrapper[4727]: I1001 12:39:14.372202 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:14 crc kubenswrapper[4727]: I1001 12:39:14.372242 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:14 crc kubenswrapper[4727]: E1001 12:39:14.372442 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:14 crc kubenswrapper[4727]: E1001 12:39:14.372653 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:39:15 crc kubenswrapper[4727]: I1001 12:39:15.371613 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:15 crc kubenswrapper[4727]: I1001 12:39:15.371687 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:15 crc kubenswrapper[4727]: E1001 12:39:15.371791 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:15 crc kubenswrapper[4727]: E1001 12:39:15.371849 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:16 crc kubenswrapper[4727]: I1001 12:39:16.372352 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:16 crc kubenswrapper[4727]: I1001 12:39:16.372389 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:16 crc kubenswrapper[4727]: E1001 12:39:16.372550 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 12:39:16 crc kubenswrapper[4727]: E1001 12:39:16.372725 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvtzh" podUID="f7f4ab8d-5f57-47bd-93fc-9219c596c436" Oct 01 12:39:17 crc kubenswrapper[4727]: I1001 12:39:17.372040 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:17 crc kubenswrapper[4727]: I1001 12:39:17.371992 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:17 crc kubenswrapper[4727]: E1001 12:39:17.372256 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 12:39:17 crc kubenswrapper[4727]: E1001 12:39:17.372401 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 12:39:18 crc kubenswrapper[4727]: I1001 12:39:18.372468 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:18 crc kubenswrapper[4727]: I1001 12:39:18.372599 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:18 crc kubenswrapper[4727]: I1001 12:39:18.376047 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 01 12:39:18 crc kubenswrapper[4727]: I1001 12:39:18.376081 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 01 12:39:18 crc kubenswrapper[4727]: I1001 12:39:18.376654 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 01 12:39:18 crc kubenswrapper[4727]: I1001 12:39:18.377181 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 01 12:39:19 crc kubenswrapper[4727]: I1001 12:39:19.371741 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:19 crc kubenswrapper[4727]: I1001 12:39:19.371779 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:19 crc kubenswrapper[4727]: I1001 12:39:19.375311 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 01 12:39:19 crc kubenswrapper[4727]: I1001 12:39:19.375430 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 01 12:39:21 crc kubenswrapper[4727]: I1001 12:39:21.399491 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.463265 4727 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.517812 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7c8v7"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.518899 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.522621 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.529118 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.529379 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.530298 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.530482 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.530525 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.530908 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.530948 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.531045 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.532184 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.543304 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.544079 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.544730 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.545333 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.546178 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.547188 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.547665 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-psvph"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.548410 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.550148 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.550709 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.551405 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.555102 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.555358 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.555480 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.555535 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.555731 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.555898 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.556146 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.555959 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.556419 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.556465 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.556539 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.556586 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.556638 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.556690 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.556910 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.556994 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.557167 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.557303 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.557849 4727 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.558057 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.558196 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.558338 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.558489 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.558614 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.558728 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.558783 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.558957 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.559724 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.561092 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dwszm"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.569755 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.572282 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vbhpd"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584430 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-config\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584474 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beaa2552-3c16-4136-99e5-50e5eb116f04-serving-cert\") pod \"route-controller-manager-6576b87f9c-w8gvn\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584497 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/423d089e-1cb8-48a0-8673-74d55aebe4f4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t8f6l\" (UID: \"423d089e-1cb8-48a0-8673-74d55aebe4f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584530 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b19dbbbe-8e00-400e-8499-7ebdf954faa2-serving-cert\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584550 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b19dbbbe-8e00-400e-8499-7ebdf954faa2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584568 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8kl7\" (UniqueName: \"kubernetes.io/projected/e6df7b62-9be2-42bf-a5bb-15dfc389a34b-kube-api-access-z8kl7\") pod \"machine-approver-56656f9798-kt4rr\" (UID: \"e6df7b62-9be2-42bf-a5bb-15dfc389a34b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584588 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beaa2552-3c16-4136-99e5-50e5eb116f04-config\") pod \"route-controller-manager-6576b87f9c-w8gvn\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584605 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/e419dff2-2c6a-4c89-8d99-0374397903b1-images\") pod \"machine-api-operator-5694c8668f-psvph\" (UID: \"e419dff2-2c6a-4c89-8d99-0374397903b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584623 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-etcd-client\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584642 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-encryption-config\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584659 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e419dff2-2c6a-4c89-8d99-0374397903b1-config\") pod \"machine-api-operator-5694c8668f-psvph\" (UID: \"e419dff2-2c6a-4c89-8d99-0374397903b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584672 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e419dff2-2c6a-4c89-8d99-0374397903b1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-psvph\" (UID: \"e419dff2-2c6a-4c89-8d99-0374397903b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584714 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8zgt\" (UniqueName: \"kubernetes.io/projected/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-kube-api-access-x8zgt\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584730 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b19dbbbe-8e00-400e-8499-7ebdf954faa2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584746 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/423d089e-1cb8-48a0-8673-74d55aebe4f4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t8f6l\" (UID: \"423d089e-1cb8-48a0-8673-74d55aebe4f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584767 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-serving-cert\") pod \"apiserver-76f77b778f-7c8v7\" (UID: 
\"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584781 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-image-import-ca\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584796 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b19dbbbe-8e00-400e-8499-7ebdf954faa2-encryption-config\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584811 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/423d089e-1cb8-48a0-8673-74d55aebe4f4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t8f6l\" (UID: \"423d089e-1cb8-48a0-8673-74d55aebe4f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584825 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beaa2552-3c16-4136-99e5-50e5eb116f04-client-ca\") pod \"route-controller-manager-6576b87f9c-w8gvn\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584839 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e6df7b62-9be2-42bf-a5bb-15dfc389a34b-auth-proxy-config\") pod \"machine-approver-56656f9798-kt4rr\" (UID: \"e6df7b62-9be2-42bf-a5bb-15dfc389a34b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584853 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-audit-dir\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584870 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e6df7b62-9be2-42bf-a5bb-15dfc389a34b-machine-approver-tls\") pod \"machine-approver-56656f9798-kt4rr\" (UID: \"e6df7b62-9be2-42bf-a5bb-15dfc389a34b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584887 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn7mq\" (UniqueName: \"kubernetes.io/projected/423d089e-1cb8-48a0-8673-74d55aebe4f4-kube-api-access-kn7mq\") pod \"cluster-image-registry-operator-dc59b4c8b-t8f6l\" (UID: \"423d089e-1cb8-48a0-8673-74d55aebe4f4\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584920 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b19dbbbe-8e00-400e-8499-7ebdf954faa2-etcd-client\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584940 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6df7b62-9be2-42bf-a5bb-15dfc389a34b-config\") pod \"machine-approver-56656f9798-kt4rr\" (UID: \"e6df7b62-9be2-42bf-a5bb-15dfc389a34b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584959 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b19dbbbe-8e00-400e-8499-7ebdf954faa2-audit-dir\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584974 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.584990 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94nj2\" (UniqueName: \"kubernetes.io/projected/beaa2552-3c16-4136-99e5-50e5eb116f04-kube-api-access-94nj2\") pod \"route-controller-manager-6576b87f9c-w8gvn\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.585040 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-etcd-serving-ca\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.585053 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b19dbbbe-8e00-400e-8499-7ebdf954faa2-audit-policies\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.585068 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-node-pullsecrets\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.585081 4727 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-audit\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.585097 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb74f\" (UniqueName: \"kubernetes.io/projected/b19dbbbe-8e00-400e-8499-7ebdf954faa2-kube-api-access-nb74f\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.585121 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kptkw\" (UniqueName: \"kubernetes.io/projected/e419dff2-2c6a-4c89-8d99-0374397903b1-kube-api-access-kptkw\") pod \"machine-api-operator-5694c8668f-psvph\" (UID: \"e419dff2-2c6a-4c89-8d99-0374397903b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.585943 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.589150 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fdjkz"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.589267 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.589580 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-89tz6"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.589643 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.590528 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.590760 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.591137 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.591445 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.591461 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.591502 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.591479 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.591587 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.594930 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zhq2q"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.595318 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8lp9x"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.595556 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qktm7"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.595721 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.595863 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qktm7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.596480 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.597256 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-p7692"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.597739 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.598160 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.598726 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.598829 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.599014 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.599093 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.599096 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.599239 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.599338 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.599527 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.599696 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.599717 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.599890 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.600040 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.600297 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.601095 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.601764 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.601867 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.601984 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.602279 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.605088 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.605355 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.605578 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.605634 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.605701 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.606104 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.606365 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.606652 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.606385 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.606895 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.606950 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.607059 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.607719 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.607878 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.608406 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-t4nz6"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.608500 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.609192 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.609218 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t4nz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.609564 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sz95m"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.610145 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.612399 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.613116 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.615376 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6rldk"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.616029 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6rldk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.638347 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.639763 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.644884 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.647623 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.652852 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.653021 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.653610 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.653687 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.653732 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.653739 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.653931 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.653983 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.655198 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.655225 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.655452 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.655704 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.655813 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.655924 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.656075 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.657138 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.657329 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.657331 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.657487 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.657628 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.657915 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.658075 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.658172 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.658172 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.658593 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.660988 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 01 12:39:22 
crc kubenswrapper[4727]: I1001 12:39:22.662388 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.663390 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2447c"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.664330 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.664502 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.665411 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.665509 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.665634 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.665728 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.666171 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.668923 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.671287 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.671336 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.671549 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.671694 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.671824 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.675677 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.676809 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-whs6s"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.678545 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.683630 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.686192 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.686361 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.686838 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.686968 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.687742 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.687794 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8zgt\" (UniqueName: \"kubernetes.io/projected/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-kube-api-access-x8zgt\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.687817 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b19dbbbe-8e00-400e-8499-7ebdf954faa2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.687839 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n4vw\" (UniqueName: \"kubernetes.io/projected/358a6cda-2c70-4bf3-847a-8f5417bf13ce-kube-api-access-6n4vw\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.687861 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/423d089e-1cb8-48a0-8673-74d55aebe4f4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t8f6l\" (UID: \"423d089e-1cb8-48a0-8673-74d55aebe4f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.687886 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-serving-cert\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.687902 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d15984be-56fb-4ab5-9ede-fcd1bf6aefce-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n87vl\" (UID: \"d15984be-56fb-4ab5-9ede-fcd1bf6aefce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.687921 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-image-import-ca\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.687936 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b19dbbbe-8e00-400e-8499-7ebdf954faa2-encryption-config\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.687952 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15984be-56fb-4ab5-9ede-fcd1bf6aefce-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n87vl\" (UID: \"d15984be-56fb-4ab5-9ede-fcd1bf6aefce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.687967 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-oauth-serving-cert\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.687984 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/423d089e-1cb8-48a0-8673-74d55aebe4f4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t8f6l\" (UID: \"423d089e-1cb8-48a0-8673-74d55aebe4f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688017 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2hbs\" (UniqueName: \"kubernetes.io/projected/e491c69e-2845-4958-8b77-ba6aa4afc6fa-kube-api-access-s2hbs\") pod \"cluster-samples-operator-665b6dd947-mhnsv\" (UID: \"e491c69e-2845-4958-8b77-ba6aa4afc6fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688036 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beaa2552-3c16-4136-99e5-50e5eb116f04-client-ca\") pod \"route-controller-manager-6576b87f9c-w8gvn\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688054 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/e6df7b62-9be2-42bf-a5bb-15dfc389a34b-auth-proxy-config\") pod \"machine-approver-56656f9798-kt4rr\" (UID: \"e6df7b62-9be2-42bf-a5bb-15dfc389a34b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688128 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/358a6cda-2c70-4bf3-847a-8f5417bf13ce-serving-cert\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688147 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-client-ca\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688162 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688187 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-audit-dir\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688209 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688228 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-trusted-ca-bundle\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688248 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/358a6cda-2c70-4bf3-847a-8f5417bf13ce-etcd-client\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688269 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d15984be-56fb-4ab5-9ede-fcd1bf6aefce-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-n87vl\" (UID: \"d15984be-56fb-4ab5-9ede-fcd1bf6aefce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688289 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688310 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-service-ca\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688333 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e6df7b62-9be2-42bf-a5bb-15dfc389a34b-machine-approver-tls\") pod \"machine-approver-56656f9798-kt4rr\" (UID: \"e6df7b62-9be2-42bf-a5bb-15dfc389a34b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688356 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn7mq\" (UniqueName: \"kubernetes.io/projected/423d089e-1cb8-48a0-8673-74d55aebe4f4-kube-api-access-kn7mq\") pod \"cluster-image-registry-operator-dc59b4c8b-t8f6l\" (UID: \"423d089e-1cb8-48a0-8673-74d55aebe4f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688383 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-serving-cert\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688397 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/358a6cda-2c70-4bf3-847a-8f5417bf13ce-etcd-service-ca\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688419 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b19dbbbe-8e00-400e-8499-7ebdf954faa2-etcd-client\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688438 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6df7b62-9be2-42bf-a5bb-15dfc389a34b-config\") pod \"machine-approver-56656f9798-kt4rr\" (UID: \"e6df7b62-9be2-42bf-a5bb-15dfc389a34b\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688465 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01db707e-986e-4b34-ba57-8f184b7ebcc5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-djszs\" (UID: \"01db707e-986e-4b34-ba57-8f184b7ebcc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688488 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ddbc86-f1a5-49b8-9418-f02063aa6637-serving-cert\") pod \"console-operator-58897d9998-vbhpd\" (UID: \"32ddbc86-f1a5-49b8-9418-f02063aa6637\") " pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688507 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d871c42-cfe9-4f9d-80b3-2ccef1246050-serving-cert\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688526 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688545 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688571 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b19dbbbe-8e00-400e-8499-7ebdf954faa2-audit-dir\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688586 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/314cd705-8127-4d02-b9c2-d2c731733ec3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hls2x\" (UID: \"314cd705-8127-4d02-b9c2-d2c731733ec3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688602 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: 
\"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688619 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk7pk\" (UniqueName: \"kubernetes.io/projected/ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2-kube-api-access-jk7pk\") pod \"openshift-apiserver-operator-796bbdcf4f-7v5w8\" (UID: \"ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688634 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2z7h\" (UniqueName: \"kubernetes.io/projected/393e430f-d192-4a64-a39b-fba4a1b1897e-kube-api-access-f2z7h\") pod \"openshift-controller-manager-operator-756b6f6bc6-w2j4h\" (UID: \"393e430f-d192-4a64-a39b-fba4a1b1897e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688673 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67n6k\" (UniqueName: \"kubernetes.io/projected/fe4ee3a0-3756-49f8-88f4-21bc1113845d-kube-api-access-67n6k\") pod \"openshift-config-operator-7777fb866f-d4gnf\" (UID: \"fe4ee3a0-3756-49f8-88f4-21bc1113845d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688688 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01db707e-986e-4b34-ba57-8f184b7ebcc5-config\") pod \"kube-controller-manager-operator-78b949d7b-djszs\" (UID: \"01db707e-986e-4b34-ba57-8f184b7ebcc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688703 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-oauth-config\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688717 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xt8w\" (UniqueName: \"kubernetes.io/projected/366b7e92-ea45-4052-8ddc-9540d534a7ad-kube-api-access-2xt8w\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688734 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688750 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e491c69e-2845-4958-8b77-ba6aa4afc6fa-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-mhnsv\" (UID: \"e491c69e-2845-4958-8b77-ba6aa4afc6fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688767 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s79pt\" (UniqueName: \"kubernetes.io/projected/54454532-1909-4aa9-b17e-f244107b202e-kube-api-access-s79pt\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688783 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94nj2\" (UniqueName: \"kubernetes.io/projected/beaa2552-3c16-4136-99e5-50e5eb116f04-kube-api-access-94nj2\") pod \"route-controller-manager-6576b87f9c-w8gvn\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688799 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/358a6cda-2c70-4bf3-847a-8f5417bf13ce-etcd-ca\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688814 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/314cd705-8127-4d02-b9c2-d2c731733ec3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hls2x\" (UID: \"314cd705-8127-4d02-b9c2-d2c731733ec3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688831 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-etcd-serving-ca\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688846 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b19dbbbe-8e00-400e-8499-7ebdf954faa2-audit-policies\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688860 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/393e430f-d192-4a64-a39b-fba4a1b1897e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-w2j4h\" (UID: \"393e430f-d192-4a64-a39b-fba4a1b1897e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688875 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688898 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-node-pullsecrets\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688919 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-audit\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688937 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb74f\" (UniqueName: \"kubernetes.io/projected/b19dbbbe-8e00-400e-8499-7ebdf954faa2-kube-api-access-nb74f\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688952 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htzkd\" (UniqueName: \"kubernetes.io/projected/d774a5b7-171a-47c7-8d71-9497eb856102-kube-api-access-htzkd\") pod \"machine-config-controller-84d6567774-28zwl\" (UID: \"d774a5b7-171a-47c7-8d71-9497eb856102\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.688980 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01db707e-986e-4b34-ba57-8f184b7ebcc5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-djszs\" (UID: \"01db707e-986e-4b34-ba57-8f184b7ebcc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689021 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mtd\" (UniqueName: \"kubernetes.io/projected/13bfb813-f506-4bc0-9296-6a3d756968a7-kube-api-access-b4mtd\") pod \"olm-operator-6b444d44fb-h82nm\" (UID: \"13bfb813-f506-4bc0-9296-6a3d756968a7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689036 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-config\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689051 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-audit-policies\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689069 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fe4ee3a0-3756-49f8-88f4-21bc1113845d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d4gnf\" (UID: \"fe4ee3a0-3756-49f8-88f4-21bc1113845d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689084 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ddbc86-f1a5-49b8-9418-f02063aa6637-config\") pod \"console-operator-58897d9998-vbhpd\" (UID: \"32ddbc86-f1a5-49b8-9418-f02063aa6637\") " pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689098 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689118 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kptkw\" (UniqueName: \"kubernetes.io/projected/e419dff2-2c6a-4c89-8d99-0374397903b1-kube-api-access-kptkw\") pod \"machine-api-operator-5694c8668f-psvph\" (UID: \"e419dff2-2c6a-4c89-8d99-0374397903b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689135 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358a6cda-2c70-4bf3-847a-8f5417bf13ce-config\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689162 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nw4p\" (UniqueName: \"kubernetes.io/projected/32ddbc86-f1a5-49b8-9418-f02063aa6637-kube-api-access-2nw4p\") pod \"console-operator-58897d9998-vbhpd\" (UID: \"32ddbc86-f1a5-49b8-9418-f02063aa6637\") " pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689178 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-config\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689194 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/13bfb813-f506-4bc0-9296-6a3d756968a7-srv-cert\") pod \"olm-operator-6b444d44fb-h82nm\" (UID: \"13bfb813-f506-4bc0-9296-6a3d756968a7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 
12:39:22.689209 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689225 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beaa2552-3c16-4136-99e5-50e5eb116f04-serving-cert\") pod \"route-controller-manager-6576b87f9c-w8gvn\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689240 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7v5w8\" (UID: \"ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689256 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4qzn\" (UniqueName: \"kubernetes.io/projected/4d871c42-cfe9-4f9d-80b3-2ccef1246050-kube-api-access-q4qzn\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689271 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/423d089e-1cb8-48a0-8673-74d55aebe4f4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t8f6l\" (UID: \"423d089e-1cb8-48a0-8673-74d55aebe4f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689294 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689310 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689324 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689341 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d774a5b7-171a-47c7-8d71-9497eb856102-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-28zwl\" (UID: \"d774a5b7-171a-47c7-8d71-9497eb856102\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689361 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b19dbbbe-8e00-400e-8499-7ebdf954faa2-serving-cert\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689381 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b19dbbbe-8e00-400e-8499-7ebdf954faa2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689402 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8kl7\" (UniqueName: \"kubernetes.io/projected/e6df7b62-9be2-42bf-a5bb-15dfc389a34b-kube-api-access-z8kl7\") pod \"machine-approver-56656f9798-kt4rr\" (UID: \"e6df7b62-9be2-42bf-a5bb-15dfc389a34b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689422 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7v5w8\" (UID: \"ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689443 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-config\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689461 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e419dff2-2c6a-4c89-8d99-0374397903b1-images\") pod \"machine-api-operator-5694c8668f-psvph\" (UID: \"e419dff2-2c6a-4c89-8d99-0374397903b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689478 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beaa2552-3c16-4136-99e5-50e5eb116f04-config\") pod \"route-controller-manager-6576b87f9c-w8gvn\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689492 4727 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32ddbc86-f1a5-49b8-9418-f02063aa6637-trusted-ca\") pod \"console-operator-58897d9998-vbhpd\" (UID: \"32ddbc86-f1a5-49b8-9418-f02063aa6637\") " pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.690246 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-image-import-ca\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.690610 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b19dbbbe-8e00-400e-8499-7ebdf954faa2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.691252 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e6df7b62-9be2-42bf-a5bb-15dfc389a34b-auth-proxy-config\") pod \"machine-approver-56656f9798-kt4rr\" (UID: \"e6df7b62-9be2-42bf-a5bb-15dfc389a34b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.691705 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/423d089e-1cb8-48a0-8673-74d55aebe4f4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t8f6l\" (UID: \"423d089e-1cb8-48a0-8673-74d55aebe4f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.691805 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beaa2552-3c16-4136-99e5-50e5eb116f04-client-ca\") pod \"route-controller-manager-6576b87f9c-w8gvn\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.691839 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mfjhb"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.692467 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mfjhb" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.692854 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.693247 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b19dbbbe-8e00-400e-8499-7ebdf954faa2-audit-policies\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.693296 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b19dbbbe-8e00-400e-8499-7ebdf954faa2-audit-dir\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.693494 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.689506 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/393e430f-d192-4a64-a39b-fba4a1b1897e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-w2j4h\" (UID: \"393e430f-d192-4a64-a39b-fba4a1b1897e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694088 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-etcd-client\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694107 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-encryption-config\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694127 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe4ee3a0-3756-49f8-88f4-21bc1113845d-serving-cert\") pod \"openshift-config-operator-7777fb866f-d4gnf\" (UID: \"fe4ee3a0-3756-49f8-88f4-21bc1113845d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694143 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/13bfb813-f506-4bc0-9296-6a3d756968a7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h82nm\" (UID: \"13bfb813-f506-4bc0-9296-6a3d756968a7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694157 4727 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/54454532-1909-4aa9-b17e-f244107b202e-audit-dir\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694173 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314cd705-8127-4d02-b9c2-d2c731733ec3-config\") pod \"kube-apiserver-operator-766d6c64bb-hls2x\" (UID: \"314cd705-8127-4d02-b9c2-d2c731733ec3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694191 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e419dff2-2c6a-4c89-8d99-0374397903b1-config\") pod \"machine-api-operator-5694c8668f-psvph\" (UID: \"e419dff2-2c6a-4c89-8d99-0374397903b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694208 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e419dff2-2c6a-4c89-8d99-0374397903b1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-psvph\" (UID: \"e419dff2-2c6a-4c89-8d99-0374397903b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694225 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d774a5b7-171a-47c7-8d71-9497eb856102-proxy-tls\") pod \"machine-config-controller-84d6567774-28zwl\" (UID: \"d774a5b7-171a-47c7-8d71-9497eb856102\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694241 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvftf\" (UniqueName: \"kubernetes.io/projected/286216d2-0a22-42ea-bbc7-40fbe51a6f98-kube-api-access-xvftf\") pod \"migrator-59844c95c7-t4nz6\" (UID: \"286216d2-0a22-42ea-bbc7-40fbe51a6f98\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t4nz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694336 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-node-pullsecrets\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694767 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-audit\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.694864 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.695556 4727 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-config\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.716207 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.717342 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b19dbbbe-8e00-400e-8499-7ebdf954faa2-encryption-config\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.717867 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e6df7b62-9be2-42bf-a5bb-15dfc389a34b-machine-approver-tls\") pod \"machine-approver-56656f9798-kt4rr\" (UID: \"e6df7b62-9be2-42bf-a5bb-15dfc389a34b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.692868 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-audit-dir\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.718171 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6df7b62-9be2-42bf-a5bb-15dfc389a34b-config\") pod \"machine-approver-56656f9798-kt4rr\" (UID: \"e6df7b62-9be2-42bf-a5bb-15dfc389a34b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.718853 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e419dff2-2c6a-4c89-8d99-0374397903b1-images\") pod \"machine-api-operator-5694c8668f-psvph\" (UID: \"e419dff2-2c6a-4c89-8d99-0374397903b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.719833 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-serving-cert\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.722807 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b19dbbbe-8e00-400e-8499-7ebdf954faa2-serving-cert\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.723783 4727 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.724542 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-encryption-config\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.724739 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/423d089e-1cb8-48a0-8673-74d55aebe4f4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t8f6l\" (UID: \"423d089e-1cb8-48a0-8673-74d55aebe4f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.725193 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b19dbbbe-8e00-400e-8499-7ebdf954faa2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.725282 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.725901 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-etcd-serving-ca\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.726619 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e419dff2-2c6a-4c89-8d99-0374397903b1-config\") pod \"machine-api-operator-5694c8668f-psvph\" (UID: \"e419dff2-2c6a-4c89-8d99-0374397903b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.726783 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.727154 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e419dff2-2c6a-4c89-8d99-0374397903b1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-psvph\" (UID: \"e419dff2-2c6a-4c89-8d99-0374397903b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.727220 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.727823 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beaa2552-3c16-4136-99e5-50e5eb116f04-config\") pod \"route-controller-manager-6576b87f9c-w8gvn\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.728200 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.728223 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beaa2552-3c16-4136-99e5-50e5eb116f04-serving-cert\") pod \"route-controller-manager-6576b87f9c-w8gvn\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.733376 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b19dbbbe-8e00-400e-8499-7ebdf954faa2-etcd-client\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.735751 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8x58q"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.739874 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7c8v7"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.740491 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.741483 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.742289 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-etcd-client\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.744284 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.746769 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.747224 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.748209 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-psvph"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.749827 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vbhpd"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.750913 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cfmpf"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.752062 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.752353 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.753575 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.755066 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-p7692"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.756484 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dwszm"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.759247 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fdjkz"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.760972 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.761790 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.762252 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.763173 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sz95m"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.767611 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.767650 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.767805 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-89tz6"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.769129 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.772188 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2447c"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.772269 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8lp9x"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.775485 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.776751 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.777597 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.778816 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qktm7"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.780195 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.781491 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.781970 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.782089 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zhq2q"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.782977 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.783925 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-t4nz6"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.784921 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.786141 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.793380 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.794243 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6rldk"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.794810 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2hbs\" (UniqueName: \"kubernetes.io/projected/e491c69e-2845-4958-8b77-ba6aa4afc6fa-kube-api-access-s2hbs\") pod \"cluster-samples-operator-665b6dd947-mhnsv\" (UID: \"e491c69e-2845-4958-8b77-ba6aa4afc6fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.794842 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/358a6cda-2c70-4bf3-847a-8f5417bf13ce-serving-cert\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.794861 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-client-ca\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 
12:39:22.794882 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.794899 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.794917 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-trusted-ca-bundle\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.794934 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/358a6cda-2c70-4bf3-847a-8f5417bf13ce-etcd-client\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.794949 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d15984be-56fb-4ab5-9ede-fcd1bf6aefce-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n87vl\" (UID: \"d15984be-56fb-4ab5-9ede-fcd1bf6aefce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.794964 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.794982 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-service-ca\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795026 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-serving-cert\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795042 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/358a6cda-2c70-4bf3-847a-8f5417bf13ce-etcd-service-ca\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795065 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01db707e-986e-4b34-ba57-8f184b7ebcc5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-djszs\" (UID: \"01db707e-986e-4b34-ba57-8f184b7ebcc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795082 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ddbc86-f1a5-49b8-9418-f02063aa6637-serving-cert\") pod \"console-operator-58897d9998-vbhpd\" (UID: \"32ddbc86-f1a5-49b8-9418-f02063aa6637\") " pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795099 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d871c42-cfe9-4f9d-80b3-2ccef1246050-serving-cert\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795117 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795135 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795158 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/314cd705-8127-4d02-b9c2-d2c731733ec3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hls2x\" (UID: \"314cd705-8127-4d02-b9c2-d2c731733ec3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795174 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795190 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk7pk\" (UniqueName: 
\"kubernetes.io/projected/ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2-kube-api-access-jk7pk\") pod \"openshift-apiserver-operator-796bbdcf4f-7v5w8\" (UID: \"ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795208 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2z7h\" (UniqueName: \"kubernetes.io/projected/393e430f-d192-4a64-a39b-fba4a1b1897e-kube-api-access-f2z7h\") pod \"openshift-controller-manager-operator-756b6f6bc6-w2j4h\" (UID: \"393e430f-d192-4a64-a39b-fba4a1b1897e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795245 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67n6k\" (UniqueName: \"kubernetes.io/projected/fe4ee3a0-3756-49f8-88f4-21bc1113845d-kube-api-access-67n6k\") pod \"openshift-config-operator-7777fb866f-d4gnf\" (UID: \"fe4ee3a0-3756-49f8-88f4-21bc1113845d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795261 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01db707e-986e-4b34-ba57-8f184b7ebcc5-config\") pod \"kube-controller-manager-operator-78b949d7b-djszs\" (UID: \"01db707e-986e-4b34-ba57-8f184b7ebcc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795277 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-oauth-config\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795293 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xt8w\" (UniqueName: \"kubernetes.io/projected/366b7e92-ea45-4052-8ddc-9540d534a7ad-kube-api-access-2xt8w\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795309 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e491c69e-2845-4958-8b77-ba6aa4afc6fa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mhnsv\" (UID: \"e491c69e-2845-4958-8b77-ba6aa4afc6fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795326 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s79pt\" (UniqueName: \"kubernetes.io/projected/54454532-1909-4aa9-b17e-f244107b202e-kube-api-access-s79pt\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795347 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/358a6cda-2c70-4bf3-847a-8f5417bf13ce-etcd-ca\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795366 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/314cd705-8127-4d02-b9c2-d2c731733ec3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hls2x\" (UID: \"314cd705-8127-4d02-b9c2-d2c731733ec3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795384 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/393e430f-d192-4a64-a39b-fba4a1b1897e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-w2j4h\" (UID: \"393e430f-d192-4a64-a39b-fba4a1b1897e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795401 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795422 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htzkd\" (UniqueName: \"kubernetes.io/projected/d774a5b7-171a-47c7-8d71-9497eb856102-kube-api-access-htzkd\") pod \"machine-config-controller-84d6567774-28zwl\" (UID: \"d774a5b7-171a-47c7-8d71-9497eb856102\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795444 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01db707e-986e-4b34-ba57-8f184b7ebcc5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-djszs\" (UID: \"01db707e-986e-4b34-ba57-8f184b7ebcc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795460 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mtd\" (UniqueName: \"kubernetes.io/projected/13bfb813-f506-4bc0-9296-6a3d756968a7-kube-api-access-b4mtd\") pod \"olm-operator-6b444d44fb-h82nm\" (UID: \"13bfb813-f506-4bc0-9296-6a3d756968a7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795492 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-config\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795515 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-audit-policies\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795532 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fe4ee3a0-3756-49f8-88f4-21bc1113845d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d4gnf\" (UID: \"fe4ee3a0-3756-49f8-88f4-21bc1113845d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795549 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ddbc86-f1a5-49b8-9418-f02063aa6637-config\") pod \"console-operator-58897d9998-vbhpd\" (UID: \"32ddbc86-f1a5-49b8-9418-f02063aa6637\") " pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795566 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795588 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358a6cda-2c70-4bf3-847a-8f5417bf13ce-config\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795605 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nw4p\" (UniqueName: \"kubernetes.io/projected/32ddbc86-f1a5-49b8-9418-f02063aa6637-kube-api-access-2nw4p\") pod \"console-operator-58897d9998-vbhpd\" (UID: \"32ddbc86-f1a5-49b8-9418-f02063aa6637\") " pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795622 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/13bfb813-f506-4bc0-9296-6a3d756968a7-srv-cert\") pod \"olm-operator-6b444d44fb-h82nm\" (UID: \"13bfb813-f506-4bc0-9296-6a3d756968a7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795774 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.795637 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796269 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7v5w8\" (UID: \"ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796288 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4qzn\" (UniqueName: \"kubernetes.io/projected/4d871c42-cfe9-4f9d-80b3-2ccef1246050-kube-api-access-q4qzn\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796317 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796335 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796379 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796402 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d774a5b7-171a-47c7-8d71-9497eb856102-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-28zwl\" (UID: \"d774a5b7-171a-47c7-8d71-9497eb856102\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796436 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7v5w8\" (UID: \"ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796452 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-config\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 
12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796469 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32ddbc86-f1a5-49b8-9418-f02063aa6637-trusted-ca\") pod \"console-operator-58897d9998-vbhpd\" (UID: \"32ddbc86-f1a5-49b8-9418-f02063aa6637\") " pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796485 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/393e430f-d192-4a64-a39b-fba4a1b1897e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-w2j4h\" (UID: \"393e430f-d192-4a64-a39b-fba4a1b1897e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796502 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe4ee3a0-3756-49f8-88f4-21bc1113845d-serving-cert\") pod \"openshift-config-operator-7777fb866f-d4gnf\" (UID: \"fe4ee3a0-3756-49f8-88f4-21bc1113845d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796518 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/13bfb813-f506-4bc0-9296-6a3d756968a7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h82nm\" (UID: \"13bfb813-f506-4bc0-9296-6a3d756968a7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796538 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796533 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/54454532-1909-4aa9-b17e-f244107b202e-audit-dir\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796591 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/54454532-1909-4aa9-b17e-f244107b202e-audit-dir\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796601 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314cd705-8127-4d02-b9c2-d2c731733ec3-config\") pod \"kube-apiserver-operator-766d6c64bb-hls2x\" (UID: \"314cd705-8127-4d02-b9c2-d2c731733ec3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796647 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/d774a5b7-171a-47c7-8d71-9497eb856102-proxy-tls\") pod \"machine-config-controller-84d6567774-28zwl\" (UID: \"d774a5b7-171a-47c7-8d71-9497eb856102\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796667 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvftf\" (UniqueName: \"kubernetes.io/projected/286216d2-0a22-42ea-bbc7-40fbe51a6f98-kube-api-access-xvftf\") pod \"migrator-59844c95c7-t4nz6\" (UID: \"286216d2-0a22-42ea-bbc7-40fbe51a6f98\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t4nz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796695 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n4vw\" (UniqueName: \"kubernetes.io/projected/358a6cda-2c70-4bf3-847a-8f5417bf13ce-kube-api-access-6n4vw\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796738 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d15984be-56fb-4ab5-9ede-fcd1bf6aefce-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n87vl\" (UID: \"d15984be-56fb-4ab5-9ede-fcd1bf6aefce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796758 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15984be-56fb-4ab5-9ede-fcd1bf6aefce-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n87vl\" (UID: \"d15984be-56fb-4ab5-9ede-fcd1bf6aefce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796774 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-oauth-serving-cert\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796841 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-service-ca\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.796934 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-client-ca\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.797175 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-trusted-ca-bundle\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 
12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.797316 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/358a6cda-2c70-4bf3-847a-8f5417bf13ce-etcd-service-ca\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.797537 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-oauth-serving-cert\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.797550 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314cd705-8127-4d02-b9c2-d2c731733ec3-config\") pod \"kube-apiserver-operator-766d6c64bb-hls2x\" (UID: \"314cd705-8127-4d02-b9c2-d2c731733ec3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.798188 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-config\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.798262 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358a6cda-2c70-4bf3-847a-8f5417bf13ce-config\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.798477 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-config\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.798638 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/358a6cda-2c70-4bf3-847a-8f5417bf13ce-etcd-client\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.798728 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-audit-policies\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.799113 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fe4ee3a0-3756-49f8-88f4-21bc1113845d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d4gnf\" (UID: \"fe4ee3a0-3756-49f8-88f4-21bc1113845d\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.799473 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.799748 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d774a5b7-171a-47c7-8d71-9497eb856102-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-28zwl\" (UID: \"d774a5b7-171a-47c7-8d71-9497eb856102\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.799783 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32ddbc86-f1a5-49b8-9418-f02063aa6637-config\") pod \"console-operator-58897d9998-vbhpd\" (UID: \"32ddbc86-f1a5-49b8-9418-f02063aa6637\") " pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.800392 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/393e430f-d192-4a64-a39b-fba4a1b1897e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-w2j4h\" (UID: \"393e430f-d192-4a64-a39b-fba4a1b1897e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.800392 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/358a6cda-2c70-4bf3-847a-8f5417bf13ce-serving-cert\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.800469 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mfjhb"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.800496 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.800692 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-whs6s"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.800826 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.800858 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.801099 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.801397 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.801632 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-serving-cert\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.802057 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.802555 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32ddbc86-f1a5-49b8-9418-f02063aa6637-trusted-ca\") pod \"console-operator-58897d9998-vbhpd\" (UID: \"32ddbc86-f1a5-49b8-9418-f02063aa6637\") " pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.803137 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.803421 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/358a6cda-2c70-4bf3-847a-8f5417bf13ce-etcd-ca\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.803754 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe4ee3a0-3756-49f8-88f4-21bc1113845d-serving-cert\") pod \"openshift-config-operator-7777fb866f-d4gnf\" (UID: \"fe4ee3a0-3756-49f8-88f4-21bc1113845d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.804127 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: 
\"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.804292 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tcfrz"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.805219 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/393e430f-d192-4a64-a39b-fba4a1b1897e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-w2j4h\" (UID: \"393e430f-d192-4a64-a39b-fba4a1b1897e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.805352 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.805371 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/314cd705-8127-4d02-b9c2-d2c731733ec3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hls2x\" (UID: \"314cd705-8127-4d02-b9c2-d2c731733ec3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.805401 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.805599 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-cxgjh"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.805681 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.805853 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.805972 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cxgjh" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.806162 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32ddbc86-f1a5-49b8-9418-f02063aa6637-serving-cert\") pod \"console-operator-58897d9998-vbhpd\" (UID: \"32ddbc86-f1a5-49b8-9418-f02063aa6637\") " pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.806825 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tcfrz"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.807594 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d871c42-cfe9-4f9d-80b3-2ccef1246050-serving-cert\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.807603 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-oauth-config\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.808031 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cfmpf"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.809397 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-42q5h"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.810862 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-42q5h" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.811973 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d15984be-56fb-4ab5-9ede-fcd1bf6aefce-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n87vl\" (UID: \"d15984be-56fb-4ab5-9ede-fcd1bf6aefce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.818100 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-42q5h"] Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.821551 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.828733 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15984be-56fb-4ab5-9ede-fcd1bf6aefce-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n87vl\" (UID: \"d15984be-56fb-4ab5-9ede-fcd1bf6aefce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.840937 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.861148 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.881768 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.886351 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e491c69e-2845-4958-8b77-ba6aa4afc6fa-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mhnsv\" (UID: \"e491c69e-2845-4958-8b77-ba6aa4afc6fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.903815 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.922402 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.941286 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.961532 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.978593 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01db707e-986e-4b34-ba57-8f184b7ebcc5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-djszs\" (UID: \"01db707e-986e-4b34-ba57-8f184b7ebcc5\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.982143 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 01 12:39:22 crc kubenswrapper[4727]: I1001 12:39:22.986324 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01db707e-986e-4b34-ba57-8f184b7ebcc5-config\") pod \"kube-controller-manager-operator-78b949d7b-djszs\" (UID: \"01db707e-986e-4b34-ba57-8f184b7ebcc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.001149 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.021772 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.041539 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.062030 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.081577 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.092623 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7v5w8\" (UID: \"ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.102024 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.114694 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7v5w8\" (UID: \"ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.121377 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.142721 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.161908 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.182121 4727 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"kube-root-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.202130 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.222440 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.241674 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.271929 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.280983 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.292348 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d774a5b7-171a-47c7-8d71-9497eb856102-proxy-tls\") pod \"machine-config-controller-84d6567774-28zwl\" (UID: \"d774a5b7-171a-47c7-8d71-9497eb856102\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.302133 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.322078 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.342953 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.362492 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.381913 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.401405 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.422438 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.441984 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.453133 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/13bfb813-f506-4bc0-9296-6a3d756968a7-srv-cert\") pod \"olm-operator-6b444d44fb-h82nm\" (UID: \"13bfb813-f506-4bc0-9296-6a3d756968a7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.461915 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 01 12:39:23 crc 
kubenswrapper[4727]: I1001 12:39:23.481652 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.496105 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/13bfb813-f506-4bc0-9296-6a3d756968a7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h82nm\" (UID: \"13bfb813-f506-4bc0-9296-6a3d756968a7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.501755 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.522139 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.542574 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.579553 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.581048 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.642589 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.661590 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.679903 4727 request.go:700] Waited for 1.015298934s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmco-proxy-tls&limit=500&resourceVersion=0 Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.682029 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.701906 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.721209 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.741853 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.761264 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.782243 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.801855 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 01 12:39:23 
crc kubenswrapper[4727]: I1001 12:39:23.821602 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.841743 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.861467 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.881098 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.915402 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8zgt\" (UniqueName: \"kubernetes.io/projected/2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d-kube-api-access-x8zgt\") pod \"apiserver-76f77b778f-7c8v7\" (UID: \"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d\") " pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.921229 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.941480 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.976582 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/423d089e-1cb8-48a0-8673-74d55aebe4f4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t8f6l\" (UID: \"423d089e-1cb8-48a0-8673-74d55aebe4f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:23 crc kubenswrapper[4727]: I1001 12:39:23.982464 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.039959 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb74f\" (UniqueName: \"kubernetes.io/projected/b19dbbbe-8e00-400e-8499-7ebdf954faa2-kube-api-access-nb74f\") pod \"apiserver-7bbb656c7d-nhtfk\" (UID: \"b19dbbbe-8e00-400e-8499-7ebdf954faa2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.041625 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.048544 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kptkw\" (UniqueName: \"kubernetes.io/projected/e419dff2-2c6a-4c89-8d99-0374397903b1-kube-api-access-kptkw\") pod \"machine-api-operator-5694c8668f-psvph\" (UID: \"e419dff2-2c6a-4c89-8d99-0374397903b1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.059055 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn7mq\" (UniqueName: \"kubernetes.io/projected/423d089e-1cb8-48a0-8673-74d55aebe4f4-kube-api-access-kn7mq\") pod \"cluster-image-registry-operator-dc59b4c8b-t8f6l\" (UID: \"423d089e-1cb8-48a0-8673-74d55aebe4f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.061842 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.072911 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.081479 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.102495 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.122476 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.164377 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.172080 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8kl7\" (UniqueName: \"kubernetes.io/projected/e6df7b62-9be2-42bf-a5bb-15dfc389a34b-kube-api-access-z8kl7\") pod \"machine-approver-56656f9798-kt4rr\" (UID: \"e6df7b62-9be2-42bf-a5bb-15dfc389a34b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.176457 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.181656 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.186675 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94nj2\" (UniqueName: \"kubernetes.io/projected/beaa2552-3c16-4136-99e5-50e5eb116f04-kube-api-access-94nj2\") pod \"route-controller-manager-6576b87f9c-w8gvn\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.202715 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.223025 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.241976 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.260438 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7c8v7"] Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.264234 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.283574 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.298621 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk"] Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.301193 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.322376 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.341963 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.361504 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.379018 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l"] Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.381065 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 01 12:39:24 crc kubenswrapper[4727]: W1001 12:39:24.386073 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod423d089e_1cb8_48a0_8673_74d55aebe4f4.slice/crio-b52622b4cd0c650c27c5b2d709e38ef3e01553431a28d7e9bc8d50634ddc4e07 
WatchSource:0}: Error finding container b52622b4cd0c650c27c5b2d709e38ef3e01553431a28d7e9bc8d50634ddc4e07: Status 404 returned error can't find the container with id b52622b4cd0c650c27c5b2d709e38ef3e01553431a28d7e9bc8d50634ddc4e07 Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.398720 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.401167 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.405730 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-psvph"] Oct 01 12:39:24 crc kubenswrapper[4727]: W1001 12:39:24.417399 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode419dff2_2c6a_4c89_8d99_0374397903b1.slice/crio-efbf176898ef6d3474d578192807742835fcba51d2d98dae2ed42592d3c53ac2 WatchSource:0}: Error finding container efbf176898ef6d3474d578192807742835fcba51d2d98dae2ed42592d3c53ac2: Status 404 returned error can't find the container with id efbf176898ef6d3474d578192807742835fcba51d2d98dae2ed42592d3c53ac2 Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.421509 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.439458 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.445814 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.461845 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.484902 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.505108 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.550934 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d15984be-56fb-4ab5-9ede-fcd1bf6aefce-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n87vl\" (UID: \"d15984be-56fb-4ab5-9ede-fcd1bf6aefce\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.560314 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2hbs\" (UniqueName: \"kubernetes.io/projected/e491c69e-2845-4958-8b77-ba6aa4afc6fa-kube-api-access-s2hbs\") pod \"cluster-samples-operator-665b6dd947-mhnsv\" (UID: \"e491c69e-2845-4958-8b77-ba6aa4afc6fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.580781 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4mtd\" (UniqueName: 
\"kubernetes.io/projected/13bfb813-f506-4bc0-9296-6a3d756968a7-kube-api-access-b4mtd\") pod \"olm-operator-6b444d44fb-h82nm\" (UID: \"13bfb813-f506-4bc0-9296-6a3d756968a7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.596525 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4qzn\" (UniqueName: \"kubernetes.io/projected/4d871c42-cfe9-4f9d-80b3-2ccef1246050-kube-api-access-q4qzn\") pod \"controller-manager-879f6c89f-zhq2q\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.599342 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn"] Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.603281 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.617359 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvftf\" (UniqueName: \"kubernetes.io/projected/286216d2-0a22-42ea-bbc7-40fbe51a6f98-kube-api-access-xvftf\") pod \"migrator-59844c95c7-t4nz6\" (UID: \"286216d2-0a22-42ea-bbc7-40fbe51a6f98\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t4nz6" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.633886 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.639834 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv" Oct 01 12:39:24 crc kubenswrapper[4727]: W1001 12:39:24.643828 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeaa2552_3c16_4136_99e5_50e5eb116f04.slice/crio-611aed8fb83ef99febeee40833bc0d52084a812585aa8f86088202b1feb1ceae WatchSource:0}: Error finding container 611aed8fb83ef99febeee40833bc0d52084a812585aa8f86088202b1feb1ceae: Status 404 returned error can't find the container with id 611aed8fb83ef99febeee40833bc0d52084a812585aa8f86088202b1feb1ceae Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.653542 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t4nz6" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.656603 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nw4p\" (UniqueName: \"kubernetes.io/projected/32ddbc86-f1a5-49b8-9418-f02063aa6637-kube-api-access-2nw4p\") pod \"console-operator-58897d9998-vbhpd\" (UID: \"32ddbc86-f1a5-49b8-9418-f02063aa6637\") " pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.670515 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n4vw\" (UniqueName: \"kubernetes.io/projected/358a6cda-2c70-4bf3-847a-8f5417bf13ce-kube-api-access-6n4vw\") pod \"etcd-operator-b45778765-fdjkz\" (UID: \"358a6cda-2c70-4bf3-847a-8f5417bf13ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.677477 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01db707e-986e-4b34-ba57-8f184b7ebcc5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-djszs\" (UID: \"01db707e-986e-4b34-ba57-8f184b7ebcc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.680326 4727 request.go:700] Waited for 1.879251037s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.690579 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.696614 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67n6k\" (UniqueName: \"kubernetes.io/projected/fe4ee3a0-3756-49f8-88f4-21bc1113845d-kube-api-access-67n6k\") pod \"openshift-config-operator-7777fb866f-d4gnf\" (UID: \"fe4ee3a0-3756-49f8-88f4-21bc1113845d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.758071 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xt8w\" (UniqueName: \"kubernetes.io/projected/366b7e92-ea45-4052-8ddc-9540d534a7ad-kube-api-access-2xt8w\") pod \"console-f9d7485db-89tz6\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.759694 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/314cd705-8127-4d02-b9c2-d2c731733ec3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hls2x\" (UID: \"314cd705-8127-4d02-b9c2-d2c731733ec3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.777789 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2z7h\" (UniqueName: \"kubernetes.io/projected/393e430f-d192-4a64-a39b-fba4a1b1897e-kube-api-access-f2z7h\") pod \"openshift-controller-manager-operator-756b6f6bc6-w2j4h\" (UID: \"393e430f-d192-4a64-a39b-fba4a1b1897e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.818204 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htzkd\" (UniqueName: \"kubernetes.io/projected/d774a5b7-171a-47c7-8d71-9497eb856102-kube-api-access-htzkd\") pod \"machine-config-controller-84d6567774-28zwl\" (UID: \"d774a5b7-171a-47c7-8d71-9497eb856102\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.821775 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.828176 4727 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.847762 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk7pk\" (UniqueName: \"kubernetes.io/projected/ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2-kube-api-access-jk7pk\") pod \"openshift-apiserver-operator-796bbdcf4f-7v5w8\" (UID: \"ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.850345 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.858403 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.862568 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.864685 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.875422 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.882353 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.882911 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.890373 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.907464 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.919814 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv"] Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.925946 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.946260 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.947892 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.948573 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s79pt\" (UniqueName: \"kubernetes.io/projected/54454532-1909-4aa9-b17e-f244107b202e-kube-api-access-s79pt\") pod \"oauth-openshift-558db77b4-dwszm\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.949375 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zhq2q"] Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.962151 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.963155 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.969474 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl"] Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.976802 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" Oct 01 12:39:24 crc kubenswrapper[4727]: I1001 12:39:24.982219 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.004744 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.016858 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-t4nz6"] Oct 01 12:39:25 crc kubenswrapper[4727]: W1001 12:39:25.047386 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod286216d2_0a22_42ea_bbc7_40fbe51a6f98.slice/crio-23465d55e4170a69b2d4f8e88a5677acc86177c9a8727c19a7ff08c6b7d22e81 WatchSource:0}: Error finding container 23465d55e4170a69b2d4f8e88a5677acc86177c9a8727c19a7ff08c6b7d22e81: Status 404 returned error can't find the container with id 23465d55e4170a69b2d4f8e88a5677acc86177c9a8727c19a7ff08c6b7d22e81 Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.078493 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm"] Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.113264 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114428 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6677870d-3e55-4f26-a052-4bfbb396b164-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114447 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea389964-1da2-4ade-8772-b8bd1a76cc27-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sz95m\" (UID: \"ea389964-1da2-4ade-8772-b8bd1a76cc27\") " pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114469 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73638d71-c9ed-4ad0-866d-67c36b52de3e-trusted-ca\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114492 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6677870d-3e55-4f26-a052-4bfbb396b164-config\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114510 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea389964-1da2-4ade-8772-b8bd1a76cc27-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sz95m\" (UID: \"ea389964-1da2-4ade-8772-b8bd1a76cc27\") " pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114527 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ftsr\" (UniqueName: \"kubernetes.io/projected/ea389964-1da2-4ade-8772-b8bd1a76cc27-kube-api-access-2ftsr\") pod \"marketplace-operator-79b997595-sz95m\" (UID: \"ea389964-1da2-4ade-8772-b8bd1a76cc27\") " pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114541 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt9vf\" (UniqueName: \"kubernetes.io/projected/61ea519c-4d97-4e3e-b932-51a3f8e2e07f-kube-api-access-bt9vf\") pod \"downloads-7954f5f757-qktm7\" (UID: \"61ea519c-4d97-4e3e-b932-51a3f8e2e07f\") " pod="openshift-console/downloads-7954f5f757-qktm7" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114564 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6x4g\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-kube-api-access-b6x4g\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: 
\"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114577 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/784aadbb-0b96-4110-8ca3-7c38ca2456e4-metrics-tls\") pod \"ingress-operator-5b745b69d9-lpgxf\" (UID: \"784aadbb-0b96-4110-8ca3-7c38ca2456e4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114590 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr2sp\" (UniqueName: \"kubernetes.io/projected/784aadbb-0b96-4110-8ca3-7c38ca2456e4-kube-api-access-tr2sp\") pod \"ingress-operator-5b745b69d9-lpgxf\" (UID: \"784aadbb-0b96-4110-8ca3-7c38ca2456e4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114605 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73638d71-c9ed-4ad0-866d-67c36b52de3e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114643 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-bound-sa-token\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114657 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dbcm\" (UniqueName: \"kubernetes.io/projected/f2187b19-c6d3-4d35-89e0-bf1124ab524f-kube-api-access-4dbcm\") pod \"dns-operator-744455d44c-6rldk\" (UID: \"f2187b19-c6d3-4d35-89e0-bf1124ab524f\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rldk" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114673 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73638d71-c9ed-4ad0-866d-67c36b52de3e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114698 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6677870d-3e55-4f26-a052-4bfbb396b164-service-ca-bundle\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114715 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-registry-tls\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") 
" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114727 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6677870d-3e55-4f26-a052-4bfbb396b164-serving-cert\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114743 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/784aadbb-0b96-4110-8ca3-7c38ca2456e4-trusted-ca\") pod \"ingress-operator-5b745b69d9-lpgxf\" (UID: \"784aadbb-0b96-4110-8ca3-7c38ca2456e4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114756 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66ptt\" (UniqueName: \"kubernetes.io/projected/6677870d-3e55-4f26-a052-4bfbb396b164-kube-api-access-66ptt\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114772 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2187b19-c6d3-4d35-89e0-bf1124ab524f-metrics-tls\") pod \"dns-operator-744455d44c-6rldk\" (UID: \"f2187b19-c6d3-4d35-89e0-bf1124ab524f\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rldk" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114797 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73638d71-c9ed-4ad0-866d-67c36b52de3e-registry-certificates\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114814 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/784aadbb-0b96-4110-8ca3-7c38ca2456e4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lpgxf\" (UID: \"784aadbb-0b96-4110-8ca3-7c38ca2456e4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.114840 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: E1001 12:39:25.115095 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:25.615083701 +0000 UTC m=+143.936438528 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.141768 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf"] Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.145334 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t4nz6" event={"ID":"286216d2-0a22-42ea-bbc7-40fbe51a6f98","Type":"ContainerStarted","Data":"23465d55e4170a69b2d4f8e88a5677acc86177c9a8727c19a7ff08c6b7d22e81"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.146940 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" event={"ID":"4d871c42-cfe9-4f9d-80b3-2ccef1246050","Type":"ContainerStarted","Data":"94b659bd9f3453ee7ae3cf5e0efd459c8fb9d9e901a9ecf4150fd5856b01ac3a"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.149743 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" event={"ID":"e419dff2-2c6a-4c89-8d99-0374397903b1","Type":"ContainerStarted","Data":"56144d97730cbe6fbf2aef82024817b4aa749bb13b0520258ac592adb3a966c1"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.149835 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" event={"ID":"e419dff2-2c6a-4c89-8d99-0374397903b1","Type":"ContainerStarted","Data":"efbf176898ef6d3474d578192807742835fcba51d2d98dae2ed42592d3c53ac2"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.151463 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" event={"ID":"423d089e-1cb8-48a0-8673-74d55aebe4f4","Type":"ContainerStarted","Data":"2502d31e8c8e7901f97fe6cc7f3d397f540e6e0e8008f58c7df3639b954a3872"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.151503 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" event={"ID":"423d089e-1cb8-48a0-8673-74d55aebe4f4","Type":"ContainerStarted","Data":"b52622b4cd0c650c27c5b2d709e38ef3e01553431a28d7e9bc8d50634ddc4e07"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.157396 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" event={"ID":"d15984be-56fb-4ab5-9ede-fcd1bf6aefce","Type":"ContainerStarted","Data":"fae7b14ee1a441303dbe89d85b8d563742525d42ff071c51ed2897f3f0b3924d"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.161751 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" event={"ID":"13bfb813-f506-4bc0-9296-6a3d756968a7","Type":"ContainerStarted","Data":"5f4330131077d7e32494aa02c7ca608ea7fdb77ca236f63bcffb0851552e8562"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.163275 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" event={"ID":"beaa2552-3c16-4136-99e5-50e5eb116f04","Type":"ContainerStarted","Data":"611aed8fb83ef99febeee40833bc0d52084a812585aa8f86088202b1feb1ceae"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.164409 4727 generic.go:334] "Generic (PLEG): container finished" podID="b19dbbbe-8e00-400e-8499-7ebdf954faa2" containerID="7a4f16b498d8e02a1d2a3d8ab6c0e5bb572d12d84cdc8af51ae2a1943906cbcf" exitCode=0 Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.164511 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" event={"ID":"b19dbbbe-8e00-400e-8499-7ebdf954faa2","Type":"ContainerDied","Data":"7a4f16b498d8e02a1d2a3d8ab6c0e5bb572d12d84cdc8af51ae2a1943906cbcf"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.164527 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" event={"ID":"b19dbbbe-8e00-400e-8499-7ebdf954faa2","Type":"ContainerStarted","Data":"a9ba742093c230fdde32b35b17f397e4d66e17a74e93eadb43951276932b47ef"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.175724 4727 generic.go:334] "Generic (PLEG): container finished" podID="2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d" containerID="4e988f710fa3bed29e7ffa1aa33c9fe25cebd84cb2736cb231c6118e84081143" exitCode=0 Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.175834 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" event={"ID":"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d","Type":"ContainerDied","Data":"4e988f710fa3bed29e7ffa1aa33c9fe25cebd84cb2736cb231c6118e84081143"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.175864 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" event={"ID":"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d","Type":"ContainerStarted","Data":"b40472aa67a31740f28a6fdc23145e0a99075624ebd9f41c6d2b1f51434145ab"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.178498 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" event={"ID":"e6df7b62-9be2-42bf-a5bb-15dfc389a34b","Type":"ContainerStarted","Data":"25dabab9a14fad7ddbcb3c62840744863be48ba049f6b85dcbfab872549b77ae"} Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.181495 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vbhpd"] Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.190211 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-89tz6"] Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.217718 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:25 crc kubenswrapper[4727]: E1001 12:39:25.217824 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:25.717800388 +0000 UTC m=+144.039155225 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.218210 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjgnr\" (UniqueName: \"kubernetes.io/projected/ffda1e5d-0dc9-400a-97b2-e2f7e7773c04-kube-api-access-mjgnr\") pod \"package-server-manager-789f6589d5-g6h55\" (UID: \"ffda1e5d-0dc9-400a-97b2-e2f7e7773c04\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.219738 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6677870d-3e55-4f26-a052-4bfbb396b164-service-ca-bundle\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.219898 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-registry-tls\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.220110 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6677870d-3e55-4f26-a052-4bfbb396b164-serving-cert\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.220883 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/784aadbb-0b96-4110-8ca3-7c38ca2456e4-trusted-ca\") pod \"ingress-operator-5b745b69d9-lpgxf\" (UID: \"784aadbb-0b96-4110-8ca3-7c38ca2456e4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.221292 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66ptt\" (UniqueName: \"kubernetes.io/projected/6677870d-3e55-4f26-a052-4bfbb396b164-kube-api-access-66ptt\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.221988 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2187b19-c6d3-4d35-89e0-bf1124ab524f-metrics-tls\") pod \"dns-operator-744455d44c-6rldk\" (UID: \"f2187b19-c6d3-4d35-89e0-bf1124ab524f\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rldk" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.222085 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6677870d-3e55-4f26-a052-4bfbb396b164-service-ca-bundle\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.222200 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/784aadbb-0b96-4110-8ca3-7c38ca2456e4-trusted-ca\") pod \"ingress-operator-5b745b69d9-lpgxf\" (UID: \"784aadbb-0b96-4110-8ca3-7c38ca2456e4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.224428 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73638d71-c9ed-4ad0-866d-67c36b52de3e-registry-certificates\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.224484 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/784aadbb-0b96-4110-8ca3-7c38ca2456e4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lpgxf\" (UID: \"784aadbb-0b96-4110-8ca3-7c38ca2456e4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.225564 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6677870d-3e55-4f26-a052-4bfbb396b164-serving-cert\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.225900 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.226097 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73638d71-c9ed-4ad0-866d-67c36b52de3e-registry-certificates\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.226421 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea389964-1da2-4ade-8772-b8bd1a76cc27-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sz95m\" (UID: \"ea389964-1da2-4ade-8772-b8bd1a76cc27\") " pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:25 crc kubenswrapper[4727]: E1001 12:39:25.226448 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-10-01 12:39:25.726431811 +0000 UTC m=+144.047786648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.226475 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6677870d-3e55-4f26-a052-4bfbb396b164-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.227474 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f2187b19-c6d3-4d35-89e0-bf1124ab524f-metrics-tls\") pod \"dns-operator-744455d44c-6rldk\" (UID: \"f2187b19-c6d3-4d35-89e0-bf1124ab524f\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rldk" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.227819 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73638d71-c9ed-4ad0-866d-67c36b52de3e-trusted-ca\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.228414 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffda1e5d-0dc9-400a-97b2-e2f7e7773c04-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g6h55\" (UID: \"ffda1e5d-0dc9-400a-97b2-e2f7e7773c04\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.228632 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6677870d-3e55-4f26-a052-4bfbb396b164-config\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.228723 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-registry-tls\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.228732 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea389964-1da2-4ade-8772-b8bd1a76cc27-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sz95m\" (UID: \"ea389964-1da2-4ade-8772-b8bd1a76cc27\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.228912 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ftsr\" (UniqueName: \"kubernetes.io/projected/ea389964-1da2-4ade-8772-b8bd1a76cc27-kube-api-access-2ftsr\") pod \"marketplace-operator-79b997595-sz95m\" (UID: \"ea389964-1da2-4ade-8772-b8bd1a76cc27\") " pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.229036 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt9vf\" (UniqueName: \"kubernetes.io/projected/61ea519c-4d97-4e3e-b932-51a3f8e2e07f-kube-api-access-bt9vf\") pod \"downloads-7954f5f757-qktm7\" (UID: \"61ea519c-4d97-4e3e-b932-51a3f8e2e07f\") " pod="openshift-console/downloads-7954f5f757-qktm7" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.229292 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6x4g\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-kube-api-access-b6x4g\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.229387 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/784aadbb-0b96-4110-8ca3-7c38ca2456e4-metrics-tls\") pod \"ingress-operator-5b745b69d9-lpgxf\" (UID: \"784aadbb-0b96-4110-8ca3-7c38ca2456e4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.229472 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr2sp\" (UniqueName: \"kubernetes.io/projected/784aadbb-0b96-4110-8ca3-7c38ca2456e4-kube-api-access-tr2sp\") pod \"ingress-operator-5b745b69d9-lpgxf\" (UID: \"784aadbb-0b96-4110-8ca3-7c38ca2456e4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.229569 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73638d71-c9ed-4ad0-866d-67c36b52de3e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.230058 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-bound-sa-token\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.230150 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dbcm\" (UniqueName: \"kubernetes.io/projected/f2187b19-c6d3-4d35-89e0-bf1124ab524f-kube-api-access-4dbcm\") pod \"dns-operator-744455d44c-6rldk\" (UID: \"f2187b19-c6d3-4d35-89e0-bf1124ab524f\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rldk" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.230248 4727 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73638d71-c9ed-4ad0-866d-67c36b52de3e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.230798 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea389964-1da2-4ade-8772-b8bd1a76cc27-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sz95m\" (UID: \"ea389964-1da2-4ade-8772-b8bd1a76cc27\") " pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.230934 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73638d71-c9ed-4ad0-866d-67c36b52de3e-trusted-ca\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.231735 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6677870d-3e55-4f26-a052-4bfbb396b164-config\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.236106 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73638d71-c9ed-4ad0-866d-67c36b52de3e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.238357 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73638d71-c9ed-4ad0-866d-67c36b52de3e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.239220 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea389964-1da2-4ade-8772-b8bd1a76cc27-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sz95m\" (UID: \"ea389964-1da2-4ade-8772-b8bd1a76cc27\") " pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.240742 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/784aadbb-0b96-4110-8ca3-7c38ca2456e4-metrics-tls\") pod \"ingress-operator-5b745b69d9-lpgxf\" (UID: \"784aadbb-0b96-4110-8ca3-7c38ca2456e4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.264542 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66ptt\" (UniqueName: \"kubernetes.io/projected/6677870d-3e55-4f26-a052-4bfbb396b164-kube-api-access-66ptt\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.266879 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6677870d-3e55-4f26-a052-4bfbb396b164-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-p7692\" (UID: \"6677870d-3e55-4f26-a052-4bfbb396b164\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.309534 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/784aadbb-0b96-4110-8ca3-7c38ca2456e4-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lpgxf\" (UID: \"784aadbb-0b96-4110-8ca3-7c38ca2456e4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: W1001 12:39:25.310296 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod366b7e92_ea45_4052_8ddc_9540d534a7ad.slice/crio-034be7a53c854ca94b0b98036eb730dad3d32ff6bc0701a35eea425000108c89 WatchSource:0}: Error finding container 034be7a53c854ca94b0b98036eb730dad3d32ff6bc0701a35eea425000108c89: Status 404 returned error can't find the container with id 034be7a53c854ca94b0b98036eb730dad3d32ff6bc0701a35eea425000108c89 Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.320874 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ftsr\" (UniqueName: \"kubernetes.io/projected/ea389964-1da2-4ade-8772-b8bd1a76cc27-kube-api-access-2ftsr\") pod \"marketplace-operator-79b997595-sz95m\" (UID: \"ea389964-1da2-4ade-8772-b8bd1a76cc27\") " pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.325472 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8"] Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.331532 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.331718 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a6fe7e2-cdb3-49f0-8697-60de951eff58-config\") pod \"service-ca-operator-777779d784-lkgbs\" (UID: \"3a6fe7e2-cdb3-49f0-8697-60de951eff58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.331748 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-stats-auth\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.331768 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2x7q\" (UniqueName: 
\"kubernetes.io/projected/1eac8826-912b-4187-a3ce-3cbde72f1839-kube-api-access-f2x7q\") pod \"service-ca-9c57cc56f-whs6s\" (UID: \"1eac8826-912b-4187-a3ce-3cbde72f1839\") " pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.331790 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aad21b5e-c192-46d8-9cbe-516f5dc5def2-webhook-cert\") pod \"packageserver-d55dfcdfc-84klz\" (UID: \"aad21b5e-c192-46d8-9cbe-516f5dc5def2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.331809 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/38f722c5-5d96-4c10-a4da-724f25123439-metrics-tls\") pod \"dns-default-cfmpf\" (UID: \"38f722c5-5d96-4c10-a4da-724f25123439\") " pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.331826 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/172bc9e3-a420-4a31-a309-98c533dfdb4f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-qhnkw\" (UID: \"172bc9e3-a420-4a31-a309-98c533dfdb4f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.331883 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a6fe7e2-cdb3-49f0-8697-60de951eff58-serving-cert\") pod \"service-ca-operator-777779d784-lkgbs\" (UID: \"3a6fe7e2-cdb3-49f0-8697-60de951eff58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.331902 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6-cert\") pod \"ingress-canary-42q5h\" (UID: \"c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6\") " pod="openshift-ingress-canary/ingress-canary-42q5h" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.331920 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-registration-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.331956 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/947ca08c-3484-42d7-9c5d-f8e7e1c7308d-node-bootstrap-token\") pod \"machine-config-server-cxgjh\" (UID: \"947ca08c-3484-42d7-9c5d-f8e7e1c7308d\") " pod="openshift-machine-config-operator/machine-config-server-cxgjh" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.332058 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8vcs\" (UniqueName: \"kubernetes.io/projected/e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1-kube-api-access-c8vcs\") pod \"machine-config-operator-74547568cd-2447c\" (UID: 
\"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: E1001 12:39:25.332161 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:25.832094767 +0000 UTC m=+144.153449834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.332259 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffda1e5d-0dc9-400a-97b2-e2f7e7773c04-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g6h55\" (UID: \"ffda1e5d-0dc9-400a-97b2-e2f7e7773c04\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.332656 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fknps\" (UniqueName: \"kubernetes.io/projected/3a6fe7e2-cdb3-49f0-8697-60de951eff58-kube-api-access-fknps\") pod \"service-ca-operator-777779d784-lkgbs\" (UID: \"3a6fe7e2-cdb3-49f0-8697-60de951eff58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.332773 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2447c\" (UID: \"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.332832 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-csi-data-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.332867 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzfzb\" (UniqueName: \"kubernetes.io/projected/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-kube-api-access-jzfzb\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.332905 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f722c5-5d96-4c10-a4da-724f25123439-config-volume\") pod \"dns-default-cfmpf\" (UID: \"38f722c5-5d96-4c10-a4da-724f25123439\") " 
pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.332946 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1eac8826-912b-4187-a3ce-3cbde72f1839-signing-key\") pod \"service-ca-9c57cc56f-whs6s\" (UID: \"1eac8826-912b-4187-a3ce-3cbde72f1839\") " pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.332993 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1974282f-c2f4-48cd-97e2-9e880203ef1c-config-volume\") pod \"collect-profiles-29322030-zjx2n\" (UID: \"1974282f-c2f4-48cd-97e2-9e880203ef1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.333144 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1eac8826-912b-4187-a3ce-3cbde72f1839-signing-cabundle\") pod \"service-ca-9c57cc56f-whs6s\" (UID: \"1eac8826-912b-4187-a3ce-3cbde72f1839\") " pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.333227 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-socket-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.333250 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c61adaff-b097-44b1-b19e-daaf23012ac0-profile-collector-cert\") pod \"catalog-operator-68c6474976-nz7c2\" (UID: \"c61adaff-b097-44b1-b19e-daaf23012ac0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.333274 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppwfh\" (UniqueName: \"kubernetes.io/projected/38f722c5-5d96-4c10-a4da-724f25123439-kube-api-access-ppwfh\") pod \"dns-default-cfmpf\" (UID: \"38f722c5-5d96-4c10-a4da-724f25123439\") " pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.333329 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-mountpoint-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.333409 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snndw\" (UniqueName: \"kubernetes.io/projected/1974282f-c2f4-48cd-97e2-9e880203ef1c-kube-api-access-snndw\") pod \"collect-profiles-29322030-zjx2n\" (UID: \"1974282f-c2f4-48cd-97e2-9e880203ef1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.333629 4727 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4npfh\" (UniqueName: \"kubernetes.io/projected/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-kube-api-access-4npfh\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.334788 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-default-certificate\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.334923 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlbft\" (UniqueName: \"kubernetes.io/projected/c61adaff-b097-44b1-b19e-daaf23012ac0-kube-api-access-qlbft\") pod \"catalog-operator-68c6474976-nz7c2\" (UID: \"c61adaff-b097-44b1-b19e-daaf23012ac0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.334952 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/aad21b5e-c192-46d8-9cbe-516f5dc5def2-tmpfs\") pod \"packageserver-d55dfcdfc-84klz\" (UID: \"aad21b5e-c192-46d8-9cbe-516f5dc5def2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.334975 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c61adaff-b097-44b1-b19e-daaf23012ac0-srv-cert\") pod \"catalog-operator-68c6474976-nz7c2\" (UID: \"c61adaff-b097-44b1-b19e-daaf23012ac0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335017 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1-images\") pod \"machine-config-operator-74547568cd-2447c\" (UID: \"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335045 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjqbw\" (UniqueName: \"kubernetes.io/projected/aad21b5e-c192-46d8-9cbe-516f5dc5def2-kube-api-access-qjqbw\") pod \"packageserver-d55dfcdfc-84klz\" (UID: \"aad21b5e-c192-46d8-9cbe-516f5dc5def2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335076 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/947ca08c-3484-42d7-9c5d-f8e7e1c7308d-certs\") pod \"machine-config-server-cxgjh\" (UID: \"947ca08c-3484-42d7-9c5d-f8e7e1c7308d\") " pod="openshift-machine-config-operator/machine-config-server-cxgjh" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335129 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1-proxy-tls\") pod \"machine-config-operator-74547568cd-2447c\" (UID: \"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335165 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aad21b5e-c192-46d8-9cbe-516f5dc5def2-apiservice-cert\") pod \"packageserver-d55dfcdfc-84klz\" (UID: \"aad21b5e-c192-46d8-9cbe-516f5dc5def2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335190 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9d7314f2-4c1b-4b56-a55a-cf5c4b153c71-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mfjhb\" (UID: \"9d7314f2-4c1b-4b56-a55a-cf5c4b153c71\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mfjhb" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335222 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zsv9\" (UniqueName: \"kubernetes.io/projected/947ca08c-3484-42d7-9c5d-f8e7e1c7308d-kube-api-access-5zsv9\") pod \"machine-config-server-cxgjh\" (UID: \"947ca08c-3484-42d7-9c5d-f8e7e1c7308d\") " pod="openshift-machine-config-operator/machine-config-server-cxgjh" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335247 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpkgb\" (UniqueName: \"kubernetes.io/projected/d9815009-494f-4e87-9d55-da93dc61b078-kube-api-access-hpkgb\") pod \"control-plane-machine-set-operator-78cbb6b69f-xkkcc\" (UID: \"d9815009-494f-4e87-9d55-da93dc61b078\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335277 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7vlt\" (UniqueName: \"kubernetes.io/projected/9d7314f2-4c1b-4b56-a55a-cf5c4b153c71-kube-api-access-p7vlt\") pod \"multus-admission-controller-857f4d67dd-mfjhb\" (UID: \"9d7314f2-4c1b-4b56-a55a-cf5c4b153c71\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mfjhb" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335315 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjgnr\" (UniqueName: \"kubernetes.io/projected/ffda1e5d-0dc9-400a-97b2-e2f7e7773c04-kube-api-access-mjgnr\") pod \"package-server-manager-789f6589d5-g6h55\" (UID: \"ffda1e5d-0dc9-400a-97b2-e2f7e7773c04\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335343 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1974282f-c2f4-48cd-97e2-9e880203ef1c-secret-volume\") pod \"collect-profiles-29322030-zjx2n\" (UID: \"1974282f-c2f4-48cd-97e2-9e880203ef1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335367 4727 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-service-ca-bundle\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335393 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/172bc9e3-a420-4a31-a309-98c533dfdb4f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-qhnkw\" (UID: \"172bc9e3-a420-4a31-a309-98c533dfdb4f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335420 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-metrics-certs\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335477 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-plugins-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.335530 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9815009-494f-4e87-9d55-da93dc61b078-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xkkcc\" (UID: \"d9815009-494f-4e87-9d55-da93dc61b078\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.336685 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ffda1e5d-0dc9-400a-97b2-e2f7e7773c04-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g6h55\" (UID: \"ffda1e5d-0dc9-400a-97b2-e2f7e7773c04\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.338085 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt9vf\" (UniqueName: \"kubernetes.io/projected/61ea519c-4d97-4e3e-b932-51a3f8e2e07f-kube-api-access-bt9vf\") pod \"downloads-7954f5f757-qktm7\" (UID: \"61ea519c-4d97-4e3e-b932-51a3f8e2e07f\") " pod="openshift-console/downloads-7954f5f757-qktm7" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.338405 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l95rd\" (UniqueName: \"kubernetes.io/projected/172bc9e3-a420-4a31-a309-98c533dfdb4f-kube-api-access-l95rd\") pod \"kube-storage-version-migrator-operator-b67b599dd-qhnkw\" (UID: \"172bc9e3-a420-4a31-a309-98c533dfdb4f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" Oct 01 12:39:25 crc 
kubenswrapper[4727]: I1001 12:39:25.338447 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx8r2\" (UniqueName: \"kubernetes.io/projected/c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6-kube-api-access-fx8r2\") pod \"ingress-canary-42q5h\" (UID: \"c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6\") " pod="openshift-ingress-canary/ingress-canary-42q5h" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.360324 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6x4g\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-kube-api-access-b6x4g\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.368874 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs"] Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.379050 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-bound-sa-token\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.400417 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr2sp\" (UniqueName: \"kubernetes.io/projected/784aadbb-0b96-4110-8ca3-7c38ca2456e4-kube-api-access-tr2sp\") pod \"ingress-operator-5b745b69d9-lpgxf\" (UID: \"784aadbb-0b96-4110-8ca3-7c38ca2456e4\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.416523 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dbcm\" (UniqueName: \"kubernetes.io/projected/f2187b19-c6d3-4d35-89e0-bf1124ab524f-kube-api-access-4dbcm\") pod \"dns-operator-744455d44c-6rldk\" (UID: \"f2187b19-c6d3-4d35-89e0-bf1124ab524f\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rldk" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441264 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-csi-data-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441296 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzfzb\" (UniqueName: \"kubernetes.io/projected/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-kube-api-access-jzfzb\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441317 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f722c5-5d96-4c10-a4da-724f25123439-config-volume\") pod \"dns-default-cfmpf\" (UID: \"38f722c5-5d96-4c10-a4da-724f25123439\") " pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441335 4727 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1eac8826-912b-4187-a3ce-3cbde72f1839-signing-key\") pod \"service-ca-9c57cc56f-whs6s\" (UID: \"1eac8826-912b-4187-a3ce-3cbde72f1839\") " pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441357 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1974282f-c2f4-48cd-97e2-9e880203ef1c-config-volume\") pod \"collect-profiles-29322030-zjx2n\" (UID: \"1974282f-c2f4-48cd-97e2-9e880203ef1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441372 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1eac8826-912b-4187-a3ce-3cbde72f1839-signing-cabundle\") pod \"service-ca-9c57cc56f-whs6s\" (UID: \"1eac8826-912b-4187-a3ce-3cbde72f1839\") " pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441397 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c61adaff-b097-44b1-b19e-daaf23012ac0-profile-collector-cert\") pod \"catalog-operator-68c6474976-nz7c2\" (UID: \"c61adaff-b097-44b1-b19e-daaf23012ac0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441417 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-socket-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441432 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppwfh\" (UniqueName: \"kubernetes.io/projected/38f722c5-5d96-4c10-a4da-724f25123439-kube-api-access-ppwfh\") pod \"dns-default-cfmpf\" (UID: \"38f722c5-5d96-4c10-a4da-724f25123439\") " pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441447 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-mountpoint-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441462 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snndw\" (UniqueName: \"kubernetes.io/projected/1974282f-c2f4-48cd-97e2-9e880203ef1c-kube-api-access-snndw\") pod \"collect-profiles-29322030-zjx2n\" (UID: \"1974282f-c2f4-48cd-97e2-9e880203ef1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441473 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-csi-data-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc 
kubenswrapper[4727]: I1001 12:39:25.441485 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-default-certificate\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441555 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4npfh\" (UniqueName: \"kubernetes.io/projected/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-kube-api-access-4npfh\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441592 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlbft\" (UniqueName: \"kubernetes.io/projected/c61adaff-b097-44b1-b19e-daaf23012ac0-kube-api-access-qlbft\") pod \"catalog-operator-68c6474976-nz7c2\" (UID: \"c61adaff-b097-44b1-b19e-daaf23012ac0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441609 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/aad21b5e-c192-46d8-9cbe-516f5dc5def2-tmpfs\") pod \"packageserver-d55dfcdfc-84klz\" (UID: \"aad21b5e-c192-46d8-9cbe-516f5dc5def2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441628 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c61adaff-b097-44b1-b19e-daaf23012ac0-srv-cert\") pod \"catalog-operator-68c6474976-nz7c2\" (UID: \"c61adaff-b097-44b1-b19e-daaf23012ac0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441643 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqbw\" (UniqueName: \"kubernetes.io/projected/aad21b5e-c192-46d8-9cbe-516f5dc5def2-kube-api-access-qjqbw\") pod \"packageserver-d55dfcdfc-84klz\" (UID: \"aad21b5e-c192-46d8-9cbe-516f5dc5def2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441661 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1-images\") pod \"machine-config-operator-74547568cd-2447c\" (UID: \"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441685 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/947ca08c-3484-42d7-9c5d-f8e7e1c7308d-certs\") pod \"machine-config-server-cxgjh\" (UID: \"947ca08c-3484-42d7-9c5d-f8e7e1c7308d\") " pod="openshift-machine-config-operator/machine-config-server-cxgjh" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441723 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1-proxy-tls\") pod 
\"machine-config-operator-74547568cd-2447c\" (UID: \"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441746 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9d7314f2-4c1b-4b56-a55a-cf5c4b153c71-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mfjhb\" (UID: \"9d7314f2-4c1b-4b56-a55a-cf5c4b153c71\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mfjhb" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441765 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aad21b5e-c192-46d8-9cbe-516f5dc5def2-apiservice-cert\") pod \"packageserver-d55dfcdfc-84klz\" (UID: \"aad21b5e-c192-46d8-9cbe-516f5dc5def2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441791 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zsv9\" (UniqueName: \"kubernetes.io/projected/947ca08c-3484-42d7-9c5d-f8e7e1c7308d-kube-api-access-5zsv9\") pod \"machine-config-server-cxgjh\" (UID: \"947ca08c-3484-42d7-9c5d-f8e7e1c7308d\") " pod="openshift-machine-config-operator/machine-config-server-cxgjh" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441808 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpkgb\" (UniqueName: \"kubernetes.io/projected/d9815009-494f-4e87-9d55-da93dc61b078-kube-api-access-hpkgb\") pod \"control-plane-machine-set-operator-78cbb6b69f-xkkcc\" (UID: \"d9815009-494f-4e87-9d55-da93dc61b078\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441832 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7vlt\" (UniqueName: \"kubernetes.io/projected/9d7314f2-4c1b-4b56-a55a-cf5c4b153c71-kube-api-access-p7vlt\") pod \"multus-admission-controller-857f4d67dd-mfjhb\" (UID: \"9d7314f2-4c1b-4b56-a55a-cf5c4b153c71\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mfjhb" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441850 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1974282f-c2f4-48cd-97e2-9e880203ef1c-secret-volume\") pod \"collect-profiles-29322030-zjx2n\" (UID: \"1974282f-c2f4-48cd-97e2-9e880203ef1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441867 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-service-ca-bundle\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441882 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/172bc9e3-a420-4a31-a309-98c533dfdb4f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-qhnkw\" (UID: \"172bc9e3-a420-4a31-a309-98c533dfdb4f\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441906 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-metrics-certs\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441931 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-plugins-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.441952 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9815009-494f-4e87-9d55-da93dc61b078-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xkkcc\" (UID: \"d9815009-494f-4e87-9d55-da93dc61b078\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442097 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l95rd\" (UniqueName: \"kubernetes.io/projected/172bc9e3-a420-4a31-a309-98c533dfdb4f-kube-api-access-l95rd\") pod \"kube-storage-version-migrator-operator-b67b599dd-qhnkw\" (UID: \"172bc9e3-a420-4a31-a309-98c533dfdb4f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442119 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx8r2\" (UniqueName: \"kubernetes.io/projected/c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6-kube-api-access-fx8r2\") pod \"ingress-canary-42q5h\" (UID: \"c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6\") " pod="openshift-ingress-canary/ingress-canary-42q5h" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442141 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a6fe7e2-cdb3-49f0-8697-60de951eff58-config\") pod \"service-ca-operator-777779d784-lkgbs\" (UID: \"3a6fe7e2-cdb3-49f0-8697-60de951eff58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442162 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-stats-auth\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442184 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aad21b5e-c192-46d8-9cbe-516f5dc5def2-webhook-cert\") pod \"packageserver-d55dfcdfc-84klz\" (UID: \"aad21b5e-c192-46d8-9cbe-516f5dc5def2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 
12:39:25.442200 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2x7q\" (UniqueName: \"kubernetes.io/projected/1eac8826-912b-4187-a3ce-3cbde72f1839-kube-api-access-f2x7q\") pod \"service-ca-9c57cc56f-whs6s\" (UID: \"1eac8826-912b-4187-a3ce-3cbde72f1839\") " pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442215 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/38f722c5-5d96-4c10-a4da-724f25123439-metrics-tls\") pod \"dns-default-cfmpf\" (UID: \"38f722c5-5d96-4c10-a4da-724f25123439\") " pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442230 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/172bc9e3-a420-4a31-a309-98c533dfdb4f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-qhnkw\" (UID: \"172bc9e3-a420-4a31-a309-98c533dfdb4f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442271 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442291 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a6fe7e2-cdb3-49f0-8697-60de951eff58-serving-cert\") pod \"service-ca-operator-777779d784-lkgbs\" (UID: \"3a6fe7e2-cdb3-49f0-8697-60de951eff58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442306 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6-cert\") pod \"ingress-canary-42q5h\" (UID: \"c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6\") " pod="openshift-ingress-canary/ingress-canary-42q5h" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442324 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-registration-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442341 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/947ca08c-3484-42d7-9c5d-f8e7e1c7308d-node-bootstrap-token\") pod \"machine-config-server-cxgjh\" (UID: \"947ca08c-3484-42d7-9c5d-f8e7e1c7308d\") " pod="openshift-machine-config-operator/machine-config-server-cxgjh" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442361 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8vcs\" (UniqueName: \"kubernetes.io/projected/e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1-kube-api-access-c8vcs\") pod 
\"machine-config-operator-74547568cd-2447c\" (UID: \"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442388 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fknps\" (UniqueName: \"kubernetes.io/projected/3a6fe7e2-cdb3-49f0-8697-60de951eff58-kube-api-access-fknps\") pod \"service-ca-operator-777779d784-lkgbs\" (UID: \"3a6fe7e2-cdb3-49f0-8697-60de951eff58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.442421 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2447c\" (UID: \"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.443950 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-socket-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.446617 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-default-certificate\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.446703 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-mountpoint-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.450059 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c61adaff-b097-44b1-b19e-daaf23012ac0-profile-collector-cert\") pod \"catalog-operator-68c6474976-nz7c2\" (UID: \"c61adaff-b097-44b1-b19e-daaf23012ac0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.451145 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1974282f-c2f4-48cd-97e2-9e880203ef1c-config-volume\") pod \"collect-profiles-29322030-zjx2n\" (UID: \"1974282f-c2f4-48cd-97e2-9e880203ef1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.452791 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fdjkz"] Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.457380 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-plugins-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: 
\"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.457712 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9d7314f2-4c1b-4b56-a55a-cf5c4b153c71-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mfjhb\" (UID: \"9d7314f2-4c1b-4b56-a55a-cf5c4b153c71\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mfjhb" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.458061 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/aad21b5e-c192-46d8-9cbe-516f5dc5def2-tmpfs\") pod \"packageserver-d55dfcdfc-84klz\" (UID: \"aad21b5e-c192-46d8-9cbe-516f5dc5def2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: E1001 12:39:25.459156 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:25.95913682 +0000 UTC m=+144.280491667 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.459374 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/172bc9e3-a420-4a31-a309-98c533dfdb4f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-qhnkw\" (UID: \"172bc9e3-a420-4a31-a309-98c533dfdb4f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.459535 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-service-ca-bundle\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.460272 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a6fe7e2-cdb3-49f0-8697-60de951eff58-config\") pod \"service-ca-operator-777779d784-lkgbs\" (UID: \"3a6fe7e2-cdb3-49f0-8697-60de951eff58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.460301 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-metrics-certs\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.460415 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-registration-dir\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.461667 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjgnr\" (UniqueName: \"kubernetes.io/projected/ffda1e5d-0dc9-400a-97b2-e2f7e7773c04-kube-api-access-mjgnr\") pod \"package-server-manager-789f6589d5-g6h55\" (UID: \"ffda1e5d-0dc9-400a-97b2-e2f7e7773c04\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.461730 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9815009-494f-4e87-9d55-da93dc61b078-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xkkcc\" (UID: \"d9815009-494f-4e87-9d55-da93dc61b078\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.462294 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c61adaff-b097-44b1-b19e-daaf23012ac0-srv-cert\") pod \"catalog-operator-68c6474976-nz7c2\" (UID: \"c61adaff-b097-44b1-b19e-daaf23012ac0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.462741 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/172bc9e3-a420-4a31-a309-98c533dfdb4f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-qhnkw\" (UID: \"172bc9e3-a420-4a31-a309-98c533dfdb4f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.463077 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1974282f-c2f4-48cd-97e2-9e880203ef1c-secret-volume\") pod \"collect-profiles-29322030-zjx2n\" (UID: \"1974282f-c2f4-48cd-97e2-9e880203ef1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.463118 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/947ca08c-3484-42d7-9c5d-f8e7e1c7308d-certs\") pod \"machine-config-server-cxgjh\" (UID: \"947ca08c-3484-42d7-9c5d-f8e7e1c7308d\") " pod="openshift-machine-config-operator/machine-config-server-cxgjh" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.463471 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aad21b5e-c192-46d8-9cbe-516f5dc5def2-webhook-cert\") pod \"packageserver-d55dfcdfc-84klz\" (UID: \"aad21b5e-c192-46d8-9cbe-516f5dc5def2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.463567 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a6fe7e2-cdb3-49f0-8697-60de951eff58-serving-cert\") pod \"service-ca-operator-777779d784-lkgbs\" (UID: \"3a6fe7e2-cdb3-49f0-8697-60de951eff58\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.465383 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-stats-auth\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.466173 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2447c\" (UID: \"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.466193 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1-proxy-tls\") pod \"machine-config-operator-74547568cd-2447c\" (UID: \"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.468600 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/947ca08c-3484-42d7-9c5d-f8e7e1c7308d-node-bootstrap-token\") pod \"machine-config-server-cxgjh\" (UID: \"947ca08c-3484-42d7-9c5d-f8e7e1c7308d\") " pod="openshift-machine-config-operator/machine-config-server-cxgjh" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.469377 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6-cert\") pod \"ingress-canary-42q5h\" (UID: \"c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6\") " pod="openshift-ingress-canary/ingress-canary-42q5h" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.469634 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1eac8826-912b-4187-a3ce-3cbde72f1839-signing-cabundle\") pod \"service-ca-9c57cc56f-whs6s\" (UID: \"1eac8826-912b-4187-a3ce-3cbde72f1839\") " pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.469730 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/38f722c5-5d96-4c10-a4da-724f25123439-config-volume\") pod \"dns-default-cfmpf\" (UID: \"38f722c5-5d96-4c10-a4da-724f25123439\") " pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.469927 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1-images\") pod \"machine-config-operator-74547568cd-2447c\" (UID: \"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.473080 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x"] Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.473143 4727 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h"] Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.475812 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/38f722c5-5d96-4c10-a4da-724f25123439-metrics-tls\") pod \"dns-default-cfmpf\" (UID: \"38f722c5-5d96-4c10-a4da-724f25123439\") " pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.476126 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aad21b5e-c192-46d8-9cbe-516f5dc5def2-apiservice-cert\") pod \"packageserver-d55dfcdfc-84klz\" (UID: \"aad21b5e-c192-46d8-9cbe-516f5dc5def2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.481266 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1eac8826-912b-4187-a3ce-3cbde72f1839-signing-key\") pod \"service-ca-9c57cc56f-whs6s\" (UID: \"1eac8826-912b-4187-a3ce-3cbde72f1839\") " pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.506243 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dwszm"] Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.506935 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qktm7" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.507215 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzfzb\" (UniqueName: \"kubernetes.io/projected/59f2fda4-ad42-4bd7-89c8-9f1a8c54123c-kube-api-access-jzfzb\") pod \"csi-hostpathplugin-tcfrz\" (UID: \"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c\") " pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.521441 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.525416 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppwfh\" (UniqueName: \"kubernetes.io/projected/38f722c5-5d96-4c10-a4da-724f25123439-kube-api-access-ppwfh\") pod \"dns-default-cfmpf\" (UID: \"38f722c5-5d96-4c10-a4da-724f25123439\") " pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.539389 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snndw\" (UniqueName: \"kubernetes.io/projected/1974282f-c2f4-48cd-97e2-9e880203ef1c-kube-api-access-snndw\") pod \"collect-profiles-29322030-zjx2n\" (UID: \"1974282f-c2f4-48cd-97e2-9e880203ef1c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.543659 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl"] Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.544462 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:25 crc kubenswrapper[4727]: E1001 12:39:25.545155 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.045124677 +0000 UTC m=+144.366479514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.566166 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpkgb\" (UniqueName: \"kubernetes.io/projected/d9815009-494f-4e87-9d55-da93dc61b078-kube-api-access-hpkgb\") pod \"control-plane-machine-set-operator-78cbb6b69f-xkkcc\" (UID: \"d9815009-494f-4e87-9d55-da93dc61b078\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.572770 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.586264 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zsv9\" (UniqueName: \"kubernetes.io/projected/947ca08c-3484-42d7-9c5d-f8e7e1c7308d-kube-api-access-5zsv9\") pod \"machine-config-server-cxgjh\" (UID: \"947ca08c-3484-42d7-9c5d-f8e7e1c7308d\") " pod="openshift-machine-config-operator/machine-config-server-cxgjh" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.589038 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6rldk" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.597675 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.600824 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4npfh\" (UniqueName: \"kubernetes.io/projected/228aaa44-0de4-45e8-87d9-78ad4fa70f2e-kube-api-access-4npfh\") pod \"router-default-5444994796-8x58q\" (UID: \"228aaa44-0de4-45e8-87d9-78ad4fa70f2e\") " pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.619756 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlbft\" (UniqueName: \"kubernetes.io/projected/c61adaff-b097-44b1-b19e-daaf23012ac0-kube-api-access-qlbft\") pod \"catalog-operator-68c6474976-nz7c2\" (UID: \"c61adaff-b097-44b1-b19e-daaf23012ac0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.634442 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.638622 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjqbw\" (UniqueName: \"kubernetes.io/projected/aad21b5e-c192-46d8-9cbe-516f5dc5def2-kube-api-access-qjqbw\") pod \"packageserver-d55dfcdfc-84klz\" (UID: \"aad21b5e-c192-46d8-9cbe-516f5dc5def2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.653414 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.653424 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" Oct 01 12:39:25 crc kubenswrapper[4727]: E1001 12:39:25.653801 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.153784646 +0000 UTC m=+144.475139583 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.661322 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.661355 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7vlt\" (UniqueName: \"kubernetes.io/projected/9d7314f2-4c1b-4b56-a55a-cf5c4b153c71-kube-api-access-p7vlt\") pod \"multus-admission-controller-857f4d67dd-mfjhb\" (UID: \"9d7314f2-4c1b-4b56-a55a-cf5c4b153c71\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mfjhb" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.676968 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2x7q\" (UniqueName: \"kubernetes.io/projected/1eac8826-912b-4187-a3ce-3cbde72f1839-kube-api-access-f2x7q\") pod \"service-ca-9c57cc56f-whs6s\" (UID: \"1eac8826-912b-4187-a3ce-3cbde72f1839\") " pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.694934 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.705717 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.709077 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l95rd\" (UniqueName: \"kubernetes.io/projected/172bc9e3-a420-4a31-a309-98c533dfdb4f-kube-api-access-l95rd\") pod \"kube-storage-version-migrator-operator-b67b599dd-qhnkw\" (UID: \"172bc9e3-a420-4a31-a309-98c533dfdb4f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.719057 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx8r2\" (UniqueName: \"kubernetes.io/projected/c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6-kube-api-access-fx8r2\") pod \"ingress-canary-42q5h\" (UID: \"c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6\") " pod="openshift-ingress-canary/ingress-canary-42q5h" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.721934 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.738102 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cxgjh" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.742467 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-42q5h" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.752785 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8vcs\" (UniqueName: \"kubernetes.io/projected/e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1-kube-api-access-c8vcs\") pod \"machine-config-operator-74547568cd-2447c\" (UID: \"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.754058 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:25 crc kubenswrapper[4727]: E1001 12:39:25.754423 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.254279645 +0000 UTC m=+144.575634482 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.754870 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: E1001 12:39:25.755253 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.255243698 +0000 UTC m=+144.576598535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.762469 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mfjhb" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.768379 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fknps\" (UniqueName: \"kubernetes.io/projected/3a6fe7e2-cdb3-49f0-8697-60de951eff58-kube-api-access-fknps\") pod \"service-ca-operator-777779d784-lkgbs\" (UID: \"3a6fe7e2-cdb3-49f0-8697-60de951eff58\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.857112 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:25 crc kubenswrapper[4727]: E1001 12:39:25.857514 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.357493198 +0000 UTC m=+144.678848035 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.885546 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qktm7"] Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.904817 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.920189 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.929020 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.929461 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.953561 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.959216 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:25 crc kubenswrapper[4727]: I1001 12:39:25.976557 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" Oct 01 12:39:25 crc kubenswrapper[4727]: E1001 12:39:25.976802 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.476776088 +0000 UTC m=+144.798130925 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.031825 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-p7692"] Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.067075 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:26 crc kubenswrapper[4727]: E1001 12:39:26.068125 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.568097026 +0000 UTC m=+144.889451863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.127375 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sz95m"] Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.171134 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:26 crc kubenswrapper[4727]: E1001 12:39:26.171906 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.671889308 +0000 UTC m=+144.993244145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.231474 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" event={"ID":"ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2","Type":"ContainerStarted","Data":"3682f27a12f0fdbebe465d749f5de9c902d8e0e44eda906836d430d9a713191e"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.273499 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" event={"ID":"54454532-1909-4aa9-b17e-f244107b202e","Type":"ContainerStarted","Data":"49b19a22b37f59b0b05e4bd54601c53febaec7bd95b8f8fbbfed49465a3d6506"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.275688 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:26 crc kubenswrapper[4727]: E1001 12:39:26.276058 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.776032833 +0000 UTC m=+145.097387720 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.276358 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:26 crc kubenswrapper[4727]: E1001 12:39:26.277065 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.777056447 +0000 UTC m=+145.098411274 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.288399 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv" event={"ID":"e491c69e-2845-4958-8b77-ba6aa4afc6fa","Type":"ContainerStarted","Data":"984780e6e924a8a1164b9242b65106eb404f9d5b331d569e8e8f45cb025a3a3c"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.306930 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" event={"ID":"393e430f-d192-4a64-a39b-fba4a1b1897e","Type":"ContainerStarted","Data":"c4d2caf665b1b2f3896c607aba67ffc6a9b5b42e14b41acca33a8c507421ab16"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.322763 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vbhpd" event={"ID":"32ddbc86-f1a5-49b8-9418-f02063aa6637","Type":"ContainerStarted","Data":"5b3d234036b5280adf1baf78015b74719b9029e5f6239f77839eaf5b96b50b12"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.322827 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.322843 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vbhpd" event={"ID":"32ddbc86-f1a5-49b8-9418-f02063aa6637","Type":"ContainerStarted","Data":"889cc060ecb8864e812c17d29961b186bd6a2a36f8733b1bc5afe43dd9f86869"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.330319 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" event={"ID":"d15984be-56fb-4ab5-9ede-fcd1bf6aefce","Type":"ContainerStarted","Data":"6cb8bc7bd0c819ecc0e2e4d40b805d8fb104d70e4d0a1130b5ec8be44691b68f"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.333469 4727 patch_prober.go:28] interesting pod/console-operator-58897d9998-vbhpd container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.334039 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vbhpd" podUID="32ddbc86-f1a5-49b8-9418-f02063aa6637" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.377438 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Oct 01 12:39:26 crc kubenswrapper[4727]: E1001 12:39:26.378242 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.87819097 +0000 UTC m=+145.199546007 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.378347 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:26 crc kubenswrapper[4727]: E1001 12:39:26.378819 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.87880696 +0000 UTC m=+145.200161797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.391884 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" event={"ID":"e6df7b62-9be2-42bf-a5bb-15dfc389a34b","Type":"ContainerStarted","Data":"186796e2dd4c833835ec9a64e04fcb3274b0992d2c509331b6098a160e05d72a"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.391947 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qktm7" event={"ID":"61ea519c-4d97-4e3e-b932-51a3f8e2e07f","Type":"ContainerStarted","Data":"e18cfd5018fd00ea8adcf57bc58d5363038356093590807b0f6eeedb50e16c26"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.401441 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" event={"ID":"e419dff2-2c6a-4c89-8d99-0374397903b1","Type":"ContainerStarted","Data":"541aa2dbc8e686cc3699fd47ccd3b00cf98c9c0c3bec815b1cf3a0bb9a7e3080"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.404388 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n"] Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.482198 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" 
event={"ID":"d774a5b7-171a-47c7-8d71-9497eb856102","Type":"ContainerStarted","Data":"03fb2e58696b63c4d2b62a7698fe1762b15ff04c26b3859d1c79f59aec27e230"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.484058 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:26 crc kubenswrapper[4727]: E1001 12:39:26.484287 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.984235308 +0000 UTC m=+145.305590145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.484381 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:26 crc kubenswrapper[4727]: E1001 12:39:26.485417 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:26.985399488 +0000 UTC m=+145.306754325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.488490 4727 generic.go:334] "Generic (PLEG): container finished" podID="fe4ee3a0-3756-49f8-88f4-21bc1113845d" containerID="fd1a26a6c7432638cf78a9c705dc53eb3a88cdef19f886caaf60ae4d170df17c" exitCode=0 Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.488564 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" event={"ID":"fe4ee3a0-3756-49f8-88f4-21bc1113845d","Type":"ContainerDied","Data":"fd1a26a6c7432638cf78a9c705dc53eb3a88cdef19f886caaf60ae4d170df17c"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.488595 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" event={"ID":"fe4ee3a0-3756-49f8-88f4-21bc1113845d","Type":"ContainerStarted","Data":"72c5f81959664cb665c236f60562221bef772d8aa322c5660b0a10660b577a4b"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.498911 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6rldk"] Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.499722 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tcfrz"] Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.525585 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf"] Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.525672 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" event={"ID":"b19dbbbe-8e00-400e-8499-7ebdf954faa2","Type":"ContainerStarted","Data":"78518dc37f2e01274b71ec581d3a1e5516b32695203a1354366381e79be18602"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.569041 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" event={"ID":"314cd705-8127-4d02-b9c2-d2c731733ec3","Type":"ContainerStarted","Data":"511f4f5fd01d84e6b5fad777711ecc78772fb43db43a0401dc67030180744c12"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.590750 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-89tz6" event={"ID":"366b7e92-ea45-4052-8ddc-9540d534a7ad","Type":"ContainerStarted","Data":"d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.591236 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-89tz6" event={"ID":"366b7e92-ea45-4052-8ddc-9540d534a7ad","Type":"ContainerStarted","Data":"034be7a53c854ca94b0b98036eb730dad3d32ff6bc0701a35eea425000108c89"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.604582 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55"] Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.605131 4727 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:26 crc kubenswrapper[4727]: E1001 12:39:26.612778 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:27.112724192 +0000 UTC m=+145.434079029 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.648923 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" event={"ID":"13bfb813-f506-4bc0-9296-6a3d756968a7","Type":"ContainerStarted","Data":"221551f86d386641d8f1a90b593dadcbb798a86f2e092c44f069a044669d23f3"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.664035 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t4nz6" event={"ID":"286216d2-0a22-42ea-bbc7-40fbe51a6f98","Type":"ContainerStarted","Data":"bd7fd93ed81f15ae4ad58d577ceff23e071191c217a062633e332b823a0c618e"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.676384 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" event={"ID":"358a6cda-2c70-4bf3-847a-8f5417bf13ce","Type":"ContainerStarted","Data":"1088fb9a6a7ac9de15b2c165ecbe4cfd0f3ae5232bdf283814091234f0a6550d"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.692032 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" event={"ID":"4d871c42-cfe9-4f9d-80b3-2ccef1246050","Type":"ContainerStarted","Data":"f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.692628 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.716074 4727 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zhq2q container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.716150 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" podUID="4d871c42-cfe9-4f9d-80b3-2ccef1246050" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.720893 4727 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:26 crc kubenswrapper[4727]: E1001 12:39:26.725043 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:27.225022103 +0000 UTC m=+145.546377030 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.744421 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" event={"ID":"01db707e-986e-4b34-ba57-8f184b7ebcc5","Type":"ContainerStarted","Data":"9aaf553c42f14bbc99408105eb785eeda59d809cf7cd79d11cd0f561692acf3b"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.768054 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-whs6s"] Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.800968 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" event={"ID":"beaa2552-3c16-4136-99e5-50e5eb116f04","Type":"ContainerStarted","Data":"ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2"} Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.801314 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.808563 4727 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-w8gvn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.808790 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" podUID="beaa2552-3c16-4136-99e5-50e5eb116f04" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.824625 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:26 crc kubenswrapper[4727]: E1001 
12:39:26.825546 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:27.325529064 +0000 UTC m=+145.646883901 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.859018 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-42q5h"] Oct 01 12:39:26 crc kubenswrapper[4727]: W1001 12:39:26.885241 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eac8826_912b_4187_a3ce_3cbde72f1839.slice/crio-04babb879a45ab492b15a0c6599029ce3a9c19c84284f935a99319c3d81c0e58 WatchSource:0}: Error finding container 04babb879a45ab492b15a0c6599029ce3a9c19c84284f935a99319c3d81c0e58: Status 404 returned error can't find the container with id 04babb879a45ab492b15a0c6599029ce3a9c19c84284f935a99319c3d81c0e58 Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.909666 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cfmpf"] Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.926390 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:26 crc kubenswrapper[4727]: E1001 12:39:26.929298 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:27.429267414 +0000 UTC m=+145.750622451 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:26 crc kubenswrapper[4727]: I1001 12:39:26.943605 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mfjhb"] Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:26.995919 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc"] Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.032126 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:27 crc kubenswrapper[4727]: E1001 12:39:27.032397 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:27.532383644 +0000 UTC m=+145.853738481 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.036235 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs"] Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.042859 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vbhpd" podStartSLOduration=123.042822249 podStartE2EDuration="2m3.042822249s" podCreationTimestamp="2025-10-01 12:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:27.036918798 +0000 UTC m=+145.358273655" watchObservedRunningTime="2025-10-01 12:39:27.042822249 +0000 UTC m=+145.364177106" Oct 01 12:39:27 crc kubenswrapper[4727]: W1001 12:39:27.089494 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38f722c5_5d96_4c10_a4da_724f25123439.slice/crio-43a85cb448d8491040c51e86cc1c25c835ff50ca75dbf0c87d69835275cfd66e WatchSource:0}: Error finding container 43a85cb448d8491040c51e86cc1c25c835ff50ca75dbf0c87d69835275cfd66e: Status 404 returned error can't find the container with id 43a85cb448d8491040c51e86cc1c25c835ff50ca75dbf0c87d69835275cfd66e Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.134368 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:27 crc kubenswrapper[4727]: E1001 12:39:27.134819 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:27.634805919 +0000 UTC m=+145.956160756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.165976 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n87vl" podStartSLOduration=122.165952639 podStartE2EDuration="2m2.165952639s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:27.160651269 +0000 UTC m=+145.482006106" watchObservedRunningTime="2025-10-01 12:39:27.165952639 +0000 UTC m=+145.487307496" Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.236803 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:27 crc kubenswrapper[4727]: E1001 12:39:27.237305 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:27.737283417 +0000 UTC m=+146.058638254 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.255135 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-psvph" podStartSLOduration=122.255101153 podStartE2EDuration="2m2.255101153s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:27.212137081 +0000 UTC m=+145.533491938" watchObservedRunningTime="2025-10-01 12:39:27.255101153 +0000 UTC m=+145.576455990" Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.342755 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:27 crc kubenswrapper[4727]: E1001 12:39:27.344558 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:27.844532567 +0000 UTC m=+146.165887404 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.422399 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t8f6l" podStartSLOduration=122.422378857 podStartE2EDuration="2m2.422378857s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:27.366624209 +0000 UTC m=+145.687979056" watchObservedRunningTime="2025-10-01 12:39:27.422378857 +0000 UTC m=+145.743733694" Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.436776 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2"] Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.457177 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2447c"] Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.463085 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:27 crc kubenswrapper[4727]: E1001 12:39:27.463224 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:27.963199175 +0000 UTC m=+146.284554012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.463423 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:27 crc kubenswrapper[4727]: E1001 12:39:27.463737 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:27.963724434 +0000 UTC m=+146.285079261 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.501836 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz"] Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.501843 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" podStartSLOduration=122.50182385 podStartE2EDuration="2m2.50182385s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:27.498488177 +0000 UTC m=+145.819843014" watchObservedRunningTime="2025-10-01 12:39:27.50182385 +0000 UTC m=+145.823178687" Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.523351 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw"] Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.564629 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:27 crc kubenswrapper[4727]: E1001 12:39:27.564977 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:28.064960129 +0000 UTC m=+146.386314966 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.583395 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" podStartSLOduration=122.583380586 podStartE2EDuration="2m2.583380586s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:27.530935121 +0000 UTC m=+145.852289958" watchObservedRunningTime="2025-10-01 12:39:27.583380586 +0000 UTC m=+145.904735423" Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.584337 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-89tz6" podStartSLOduration=123.584333028 podStartE2EDuration="2m3.584333028s" podCreationTimestamp="2025-10-01 12:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:27.575634932 +0000 UTC m=+145.896989779" watchObservedRunningTime="2025-10-01 12:39:27.584333028 +0000 UTC m=+145.905687865" Oct 01 12:39:27 crc kubenswrapper[4727]: W1001 12:39:27.585626 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc61adaff_b097_44b1_b19e_daaf23012ac0.slice/crio-707b5450d209c98c70305349a85489b9f5c676d236463ec927c37ebffd168a17 WatchSource:0}: Error finding container 707b5450d209c98c70305349a85489b9f5c676d236463ec927c37ebffd168a17: Status 404 returned error can't find the container with id 707b5450d209c98c70305349a85489b9f5c676d236463ec927c37ebffd168a17 Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.612400 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t4nz6" podStartSLOduration=122.612382313 podStartE2EDuration="2m2.612382313s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:27.608520521 +0000 UTC m=+145.929875358" watchObservedRunningTime="2025-10-01 12:39:27.612382313 +0000 UTC m=+145.933737150" Oct 01 12:39:27 crc kubenswrapper[4727]: W1001 12:39:27.625365 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaad21b5e_c192_46d8_9cbe_516f5dc5def2.slice/crio-6b7a232da619e3cbbdcca23dba689d1d8e83675aa5f32c5f70c52d25a777fcd9 WatchSource:0}: Error finding container 6b7a232da619e3cbbdcca23dba689d1d8e83675aa5f32c5f70c52d25a777fcd9: Status 404 returned error can't find the container with id 6b7a232da619e3cbbdcca23dba689d1d8e83675aa5f32c5f70c52d25a777fcd9 Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.650769 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" podStartSLOduration=122.650752319 
podStartE2EDuration="2m2.650752319s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:27.648026186 +0000 UTC m=+145.969381033" watchObservedRunningTime="2025-10-01 12:39:27.650752319 +0000 UTC m=+145.972107146" Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.666581 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:27 crc kubenswrapper[4727]: E1001 12:39:27.667014 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:28.166981401 +0000 UTC m=+146.488336238 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.749639 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" podStartSLOduration=122.749618583 podStartE2EDuration="2m2.749618583s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:27.702255352 +0000 UTC m=+146.023610209" watchObservedRunningTime="2025-10-01 12:39:27.749618583 +0000 UTC m=+146.070973420" Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.750819 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" podStartSLOduration=122.750813255 podStartE2EDuration="2m2.750813255s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:27.749258841 +0000 UTC m=+146.070613688" watchObservedRunningTime="2025-10-01 12:39:27.750813255 +0000 UTC m=+146.072168092" Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.769245 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:27 crc kubenswrapper[4727]: E1001 12:39:27.769905 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 12:39:28.269830851 +0000 UTC m=+146.591185688 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.873171 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:27 crc kubenswrapper[4727]: E1001 12:39:27.873752 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:28.373739658 +0000 UTC m=+146.695094495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:27 crc kubenswrapper[4727]: I1001 12:39:27.978634 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:27 crc kubenswrapper[4727]: E1001 12:39:27.978949 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:28.478933888 +0000 UTC m=+146.800288725 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.064128 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" event={"ID":"784aadbb-0b96-4110-8ca3-7c38ca2456e4","Type":"ContainerStarted","Data":"c44732d49895d49543bd003a6003dd80bb8c5f8d73032fcc31a3b3fee718a296"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.081059 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:28 crc kubenswrapper[4727]: E1001 12:39:28.081361 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:28.581350234 +0000 UTC m=+146.902705071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.086940 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" event={"ID":"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d","Type":"ContainerStarted","Data":"4cead2ce495313a68c979c09b37c9c07f2e32140b562674d3125fb0c032154dd"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.087015 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" event={"ID":"2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d","Type":"ContainerStarted","Data":"3f0f901d0aafcf9a0f82f25e2bc2d00f7ba08dbf6122d00bc2526e43e8ee036c"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.115960 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" event={"ID":"6677870d-3e55-4f26-a052-4bfbb396b164","Type":"ContainerStarted","Data":"8fb60e4cb841c8f574e82893b9eb7e2f299cfd3dcf431e98bee20b15152e54fb"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.116096 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" event={"ID":"6677870d-3e55-4f26-a052-4bfbb396b164","Type":"ContainerStarted","Data":"8a8fc54649ad20c8b9184dd447fadae1dd6cab75552fef7b45d4936508978229"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.127966 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-cxgjh" event={"ID":"947ca08c-3484-42d7-9c5d-f8e7e1c7308d","Type":"ContainerStarted","Data":"ef548912af801b2e7b032998d3d2aaf148cf291c3fc963f88cbaf0a79414e87c"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.128057 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cxgjh" event={"ID":"947ca08c-3484-42d7-9c5d-f8e7e1c7308d","Type":"ContainerStarted","Data":"2fe5fa187c07f7a9b564df38bac0d40dbed9ed6dc51e0231b17509964f540aaf"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.140203 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" podStartSLOduration=123.140184066 podStartE2EDuration="2m3.140184066s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:27.791684645 +0000 UTC m=+146.113039482" watchObservedRunningTime="2025-10-01 12:39:28.140184066 +0000 UTC m=+146.461538903" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.165923 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" event={"ID":"ad83fce3-f3c7-42eb-8c3e-1b2d964a6dd2","Type":"ContainerStarted","Data":"a8d45e5fe85eb24c710c5b75991ec908b42b5c12f841fdd6162d0bbaa938500c"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.173432 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.177277 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv" event={"ID":"e491c69e-2845-4958-8b77-ba6aa4afc6fa","Type":"ContainerStarted","Data":"1a6a37bfcbce4e92ce245f5b3029861eddaf99919482f07886056c6a93d97636"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.177331 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv" event={"ID":"e491c69e-2845-4958-8b77-ba6aa4afc6fa","Type":"ContainerStarted","Data":"5c7ce7e297f1f7d76c94f3559cc370f4f50076360eb92b29828c320c6753c312"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.188622 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.188850 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-p7692" podStartSLOduration=123.188833252 podStartE2EDuration="2m3.188833252s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.187339921 +0000 UTC m=+146.508694768" watchObservedRunningTime="2025-10-01 12:39:28.188833252 +0000 UTC m=+146.510188089" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.190278 4727 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" podStartSLOduration=123.190267611 podStartE2EDuration="2m3.190267611s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.14353303 +0000 UTC m=+146.464887877" watchObservedRunningTime="2025-10-01 12:39:28.190267611 +0000 UTC m=+146.511622438" Oct 01 12:39:28 crc kubenswrapper[4727]: E1001 12:39:28.190571 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:28.69055344 +0000 UTC m=+147.011908317 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.228301 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-cxgjh" podStartSLOduration=6.228275154 podStartE2EDuration="6.228275154s" podCreationTimestamp="2025-10-01 12:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.217925142 +0000 UTC m=+146.539279979" watchObservedRunningTime="2025-10-01 12:39:28.228275154 +0000 UTC m=+146.549629991" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.268453 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" event={"ID":"aad21b5e-c192-46d8-9cbe-516f5dc5def2","Type":"ContainerStarted","Data":"6b7a232da619e3cbbdcca23dba689d1d8e83675aa5f32c5f70c52d25a777fcd9"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.274901 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mhnsv" podStartSLOduration=124.274854929 podStartE2EDuration="2m4.274854929s" podCreationTimestamp="2025-10-01 12:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.273441831 +0000 UTC m=+146.594796688" watchObservedRunningTime="2025-10-01 12:39:28.274854929 +0000 UTC m=+146.596209766" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.300739 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:28 crc kubenswrapper[4727]: E1001 12:39:28.301539 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 12:39:28.801525917 +0000 UTC m=+147.122880754 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.306143 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7v5w8" podStartSLOduration=123.306115733 podStartE2EDuration="2m3.306115733s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.294465127 +0000 UTC m=+146.615819984" watchObservedRunningTime="2025-10-01 12:39:28.306115733 +0000 UTC m=+146.627470590" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.348157 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" event={"ID":"e6df7b62-9be2-42bf-a5bb-15dfc389a34b","Type":"ContainerStarted","Data":"b50e964c4305e9b1e14ab9481a980e6805fd7ffa48f87e857d128cfd759e0aa9"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.350413 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" podStartSLOduration=124.35039636 podStartE2EDuration="2m4.35039636s" podCreationTimestamp="2025-10-01 12:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.347859834 +0000 UTC m=+146.669214671" watchObservedRunningTime="2025-10-01 12:39:28.35039636 +0000 UTC m=+146.671751197" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.404649 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:28 crc kubenswrapper[4727]: E1001 12:39:28.405771 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:28.905754064 +0000 UTC m=+147.227108901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.409435 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" event={"ID":"172bc9e3-a420-4a31-a309-98c533dfdb4f","Type":"ContainerStarted","Data":"2ae238283ff598761127086648d99736247ae594edb47cc1e3d53f996b7af00b"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.412029 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8x58q" event={"ID":"228aaa44-0de4-45e8-87d9-78ad4fa70f2e","Type":"ContainerStarted","Data":"c53d9968645b502ab58db3226e297de97a312bc1be34dcc1645d04d186592aa3"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.412084 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8x58q" event={"ID":"228aaa44-0de4-45e8-87d9-78ad4fa70f2e","Type":"ContainerStarted","Data":"09989ebfe73406000ba0b9b2bd2401de11bd81238b4e550c164d9b56199aebb2"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.417534 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kt4rr" podStartSLOduration=125.417489704 podStartE2EDuration="2m5.417489704s" podCreationTimestamp="2025-10-01 12:37:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.392024177 +0000 UTC m=+146.713379034" watchObservedRunningTime="2025-10-01 12:39:28.417489704 +0000 UTC m=+146.738844561" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.428017 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" event={"ID":"54454532-1909-4aa9-b17e-f244107b202e","Type":"ContainerStarted","Data":"de1cf371611a3b346da44f6b17ea101fb8ef51d4fa0defab23a5dc923d80756b"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.436079 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.445737 4727 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-dwszm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.445807 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" podUID="54454532-1909-4aa9-b17e-f244107b202e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.459094 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8x58q" 
podStartSLOduration=123.459056898 podStartE2EDuration="2m3.459056898s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.453308782 +0000 UTC m=+146.774663619" watchObservedRunningTime="2025-10-01 12:39:28.459056898 +0000 UTC m=+146.780411725" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.472038 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fdjkz" event={"ID":"358a6cda-2c70-4bf3-847a-8f5417bf13ce","Type":"ContainerStarted","Data":"9e5ab6fb3a0efa7e9487de53c820a753c350868d6c62f93a5bb74edc50d829c4"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.477397 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mfjhb" event={"ID":"9d7314f2-4c1b-4b56-a55a-cf5c4b153c71","Type":"ContainerStarted","Data":"6bfe39db5d97a9be890034ef91d81f59953223f5605681625a50a3ca7d590f87"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.497700 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qktm7" event={"ID":"61ea519c-4d97-4e3e-b932-51a3f8e2e07f","Type":"ContainerStarted","Data":"fee36f592a0d6f4250482e4aa38eb6c5e7cea98fab98696b686be749fd0da580"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.498920 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qktm7" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.502894 4727 patch_prober.go:28] interesting pod/downloads-7954f5f757-qktm7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.502935 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qktm7" podUID="61ea519c-4d97-4e3e-b932-51a3f8e2e07f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.510613 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:28 crc kubenswrapper[4727]: E1001 12:39:28.513498 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:29.013480331 +0000 UTC m=+147.334835238 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.542908 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qktm7" podStartSLOduration=124.542892512 podStartE2EDuration="2m4.542892512s" podCreationTimestamp="2025-10-01 12:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.54079573 +0000 UTC m=+146.862150567" watchObservedRunningTime="2025-10-01 12:39:28.542892512 +0000 UTC m=+146.864247349" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.543797 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" podStartSLOduration=123.543789952 podStartE2EDuration="2m3.543789952s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.492349382 +0000 UTC m=+146.813704229" watchObservedRunningTime="2025-10-01 12:39:28.543789952 +0000 UTC m=+146.865144789" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.576365 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" event={"ID":"d774a5b7-171a-47c7-8d71-9497eb856102","Type":"ContainerStarted","Data":"d2e091026ceabf8d0376dde9f91a474fc55b921678f79c39f3381f9de1dd6fb6"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.576699 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" event={"ID":"d774a5b7-171a-47c7-8d71-9497eb856102","Type":"ContainerStarted","Data":"4c4e5844c83ca288388c56abad57d0287aa3b38bc325d74d83bb61a7791be3f0"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.608218 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-42q5h" event={"ID":"c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6","Type":"ContainerStarted","Data":"8d3eba9b526cf511dde14553ce8cffabb65ae70a71d320ada5260fd248b9613c"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.611479 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:28 crc kubenswrapper[4727]: E1001 12:39:28.612570 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:29.112555242 +0000 UTC m=+147.433910079 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.649265 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc" event={"ID":"d9815009-494f-4e87-9d55-da93dc61b078","Type":"ContainerStarted","Data":"c264e694562a9aba87443f057136e9192e0a68f59e6f0d6cf9d84f9bc19c249f"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.662743 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" event={"ID":"ea389964-1da2-4ade-8772-b8bd1a76cc27","Type":"ContainerStarted","Data":"a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.662801 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" event={"ID":"ea389964-1da2-4ade-8772-b8bd1a76cc27","Type":"ContainerStarted","Data":"0c0c4134a8e85925c85f0b66719ab66af848303596747f568d95b33fa90f7b40"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.666542 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.673743 4727 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sz95m container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.673815 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" podUID="ea389964-1da2-4ade-8772-b8bd1a76cc27" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.679993 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-28zwl" podStartSLOduration=123.679968007 podStartE2EDuration="2m3.679968007s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.613919479 +0000 UTC m=+146.935274316" watchObservedRunningTime="2025-10-01 12:39:28.679968007 +0000 UTC m=+147.001322844" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.683149 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc" podStartSLOduration=123.683139725 podStartE2EDuration="2m3.683139725s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.68033968 +0000 UTC m=+147.001694517" 
watchObservedRunningTime="2025-10-01 12:39:28.683139725 +0000 UTC m=+147.004494562" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.684731 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" event={"ID":"ffda1e5d-0dc9-400a-97b2-e2f7e7773c04","Type":"ContainerStarted","Data":"50faa71c2b358e5e99b97ded8db60bb8d5cdaa957d4b560368a147a1fd437e4d"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.690361 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-djszs" event={"ID":"01db707e-986e-4b34-ba57-8f184b7ebcc5","Type":"ContainerStarted","Data":"ddcd9be602a15cd3b4eb900ece3c997de1404b80af016c67ff7db92ee7475c88"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.697225 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.701561 4727 patch_prober.go:28] interesting pod/router-default-5444994796-8x58q container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.701623 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8x58q" podUID="228aaa44-0de4-45e8-87d9-78ad4fa70f2e" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.713492 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" event={"ID":"1974282f-c2f4-48cd-97e2-9e880203ef1c","Type":"ContainerStarted","Data":"d9fb1e4352d7559f140d9f5fdb64bd79ffc4b43f2e90b5d94ac7269b7e83d9b5"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.713546 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" event={"ID":"1974282f-c2f4-48cd-97e2-9e880203ef1c","Type":"ContainerStarted","Data":"e7db7b39438668f9f764fc907971dca3c7caf432ac4cc0cd3aa310238c679c34"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.714282 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:28 crc kubenswrapper[4727]: E1001 12:39:28.716803 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:29.216782519 +0000 UTC m=+147.538137356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.737318 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" podStartSLOduration=123.737301488 podStartE2EDuration="2m3.737301488s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.737237406 +0000 UTC m=+147.058592253" watchObservedRunningTime="2025-10-01 12:39:28.737301488 +0000 UTC m=+147.058656315" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.739340 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6rldk" event={"ID":"f2187b19-c6d3-4d35-89e0-bf1124ab524f","Type":"ContainerStarted","Data":"94c650eb5810afdc4e855c18a26f4038917fe61089fd6dbbbb44bfd9ce812c91"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.741216 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" event={"ID":"393e430f-d192-4a64-a39b-fba4a1b1897e","Type":"ContainerStarted","Data":"1f6c825fa2a17ce6763cc2a917713f26887acc5f05d1cd330d4448052a898b9e"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.742898 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" event={"ID":"314cd705-8127-4d02-b9c2-d2c731733ec3","Type":"ContainerStarted","Data":"c107cb4ceb8f27795d222f9a4ae51c0db00f6b00386dc676b1eebb7d2c5d7a0e"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.744419 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-t4nz6" event={"ID":"286216d2-0a22-42ea-bbc7-40fbe51a6f98","Type":"ContainerStarted","Data":"8589e614470a33626189ea145d1ad341eb74a6cbac1d6e4d12d06261c8bb4267"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.760919 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" event={"ID":"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c","Type":"ContainerStarted","Data":"d595111f159456a13d20120511f66e6c1e14ea576c610e6f1db9a2b7130cf2ee"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.769675 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cfmpf" event={"ID":"38f722c5-5d96-4c10-a4da-724f25123439","Type":"ContainerStarted","Data":"43a85cb448d8491040c51e86cc1c25c835ff50ca75dbf0c87d69835275cfd66e"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.771808 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" event={"ID":"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1","Type":"ContainerStarted","Data":"e714aea57b158349c7e1848e9e92ef83bde9ed83323ee642801cc4fe7745a410"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.773841 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" event={"ID":"c61adaff-b097-44b1-b19e-daaf23012ac0","Type":"ContainerStarted","Data":"707b5450d209c98c70305349a85489b9f5c676d236463ec927c37ebffd168a17"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.790041 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" podStartSLOduration=124.790020222 podStartE2EDuration="2m4.790020222s" podCreationTimestamp="2025-10-01 12:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.786848964 +0000 UTC m=+147.108203801" watchObservedRunningTime="2025-10-01 12:39:28.790020222 +0000 UTC m=+147.111375059" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.792466 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" event={"ID":"3a6fe7e2-cdb3-49f0-8697-60de951eff58","Type":"ContainerStarted","Data":"cf34d4604167e43838affb0655cf0b2bf8c2d71e04f070f1084827a1212e5c2a"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.816145 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:28 crc kubenswrapper[4727]: E1001 12:39:28.817538 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:29.317519718 +0000 UTC m=+147.638874555 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.820949 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hls2x" podStartSLOduration=123.820937255 podStartE2EDuration="2m3.820937255s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.820100476 +0000 UTC m=+147.141455313" watchObservedRunningTime="2025-10-01 12:39:28.820937255 +0000 UTC m=+147.142292092" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.826743 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" event={"ID":"1eac8826-912b-4187-a3ce-3cbde72f1839","Type":"ContainerStarted","Data":"04babb879a45ab492b15a0c6599029ce3a9c19c84284f935a99319c3d81c0e58"} Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.828673 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.837075 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.857634 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.886462 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-w2j4h" podStartSLOduration=124.886443473 podStartE2EDuration="2m4.886443473s" podCreationTimestamp="2025-10-01 12:37:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.845217171 +0000 UTC m=+147.166572008" watchObservedRunningTime="2025-10-01 12:39:28.886443473 +0000 UTC m=+147.207798310" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.887796 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" podStartSLOduration=123.88779162 podStartE2EDuration="2m3.88779162s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:28.884506238 +0000 UTC m=+147.205861085" watchObservedRunningTime="2025-10-01 12:39:28.88779162 +0000 UTC m=+147.209146457" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.888703 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h82nm" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.936507 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console-operator/console-operator-58897d9998-vbhpd" Oct 01 12:39:28 crc kubenswrapper[4727]: I1001 12:39:28.937539 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:28 crc kubenswrapper[4727]: E1001 12:39:28.943895 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:29.443879218 +0000 UTC m=+147.765234055 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.046072 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.046116 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.049537 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:29 crc kubenswrapper[4727]: E1001 12:39:29.050225 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:29.550209278 +0000 UTC m=+147.871564115 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.073245 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.073637 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.113934 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.152369 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:29 crc kubenswrapper[4727]: E1001 12:39:29.152844 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:29.65282614 +0000 UTC m=+147.974180977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.257566 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:29 crc kubenswrapper[4727]: E1001 12:39:29.258170 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:29.758146804 +0000 UTC m=+148.079501641 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.358881 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:29 crc kubenswrapper[4727]: E1001 12:39:29.359598 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:29.859578176 +0000 UTC m=+148.180933013 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.461025 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:29 crc kubenswrapper[4727]: E1001 12:39:29.461478 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:29.961448993 +0000 UTC m=+148.282803820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.564485 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:29 crc kubenswrapper[4727]: E1001 12:39:29.565033 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.064986616 +0000 UTC m=+148.386341453 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.666222 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:29 crc kubenswrapper[4727]: E1001 12:39:29.666730 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.166660397 +0000 UTC m=+148.488015244 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.666915 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:29 crc kubenswrapper[4727]: E1001 12:39:29.667490 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.167463894 +0000 UTC m=+148.488818921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.708070 4727 patch_prober.go:28] interesting pod/router-default-5444994796-8x58q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:39:29 crc kubenswrapper[4727]: [-]has-synced failed: reason withheld Oct 01 12:39:29 crc kubenswrapper[4727]: [+]process-running ok Oct 01 12:39:29 crc kubenswrapper[4727]: healthz check failed Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.708170 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8x58q" podUID="228aaa44-0de4-45e8-87d9-78ad4fa70f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.767904 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:29 crc kubenswrapper[4727]: E1001 12:39:29.768098 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.268066968 +0000 UTC m=+148.589421815 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.768284 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:29 crc kubenswrapper[4727]: E1001 12:39:29.768598 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.268590326 +0000 UTC m=+148.589945153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.834480 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" event={"ID":"3a6fe7e2-cdb3-49f0-8697-60de951eff58","Type":"ContainerStarted","Data":"c10848fd21c97f17d3bfce9f1b985d442144f83a5f375e2db8304a336dd8e4be"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.837280 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" event={"ID":"fe4ee3a0-3756-49f8-88f4-21bc1113845d","Type":"ContainerStarted","Data":"fb7f14e4d70e7e99c1a98e207fb7fc762a69f930e75c7fcd059258f7a3842870"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.839250 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cfmpf" event={"ID":"38f722c5-5d96-4c10-a4da-724f25123439","Type":"ContainerStarted","Data":"7f4966be40125151f2112a2b12fd0bfb888ae7ab3ff75ee5253f07b8257c7411"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.839331 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cfmpf" event={"ID":"38f722c5-5d96-4c10-a4da-724f25123439","Type":"ContainerStarted","Data":"44d53abad8da746676efc0345df73898820ee2d8e8329c3cbb87c5f9b2e00e7d"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.839494 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.841201 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" event={"ID":"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1","Type":"ContainerStarted","Data":"5802b9c024e165711231711dc048745501760492e32be4872c8641e488698e39"} Oct 01 
12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.841251 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" event={"ID":"e70b82d9-ed36-4c43-a868-e4d7f1b6ecd1","Type":"ContainerStarted","Data":"285d83e19635c9ead34dc7f4e67b01eac52c8b64289e354da7e9a0422a77ba11"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.853633 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" event={"ID":"c61adaff-b097-44b1-b19e-daaf23012ac0","Type":"ContainerStarted","Data":"c355f5a289360df4abe19005eded119979de283fc980139d149746ed4b1b7743"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.853910 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.856246 4727 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-nz7c2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.856297 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" podUID="c61adaff-b097-44b1-b19e-daaf23012ac0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.856845 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" event={"ID":"ffda1e5d-0dc9-400a-97b2-e2f7e7773c04","Type":"ContainerStarted","Data":"622125bac140210e1a04088e91b14fa60b29a94baebd4f23c87d5b00c63e43ff"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.856881 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" event={"ID":"ffda1e5d-0dc9-400a-97b2-e2f7e7773c04","Type":"ContainerStarted","Data":"920adfb20052a1d7627d593914fa9412f79ce1c23366990c3f2097521a0db7be"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.857010 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.868498 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-42q5h" event={"ID":"c9b6c239-689b-49b7-bdd1-0f3bdc7e53d6","Type":"ContainerStarted","Data":"298e8bc3c3e21db9147bf150e44098f1f6ca152e22ebda52e3b9f46390bd19ed"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.869034 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:29 crc kubenswrapper[4727]: E1001 12:39:29.869343 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.369320875 +0000 UTC m=+148.690675712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.874427 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-whs6s" event={"ID":"1eac8826-912b-4187-a3ce-3cbde72f1839","Type":"ContainerStarted","Data":"8c2bd28cf7d6c1403f20a148d6998722cb8fd99b2e64de7e8a6d40a09ca3d9cb"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.892754 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mfjhb" event={"ID":"9d7314f2-4c1b-4b56-a55a-cf5c4b153c71","Type":"ContainerStarted","Data":"48987bcdf3255b9de15669609fcc1868eecf9288149ed40954c70fab38eb8274"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.893089 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mfjhb" event={"ID":"9d7314f2-4c1b-4b56-a55a-cf5c4b153c71","Type":"ContainerStarted","Data":"156a261074ddd202edeff979d183387f0135d132acb3d16238fd6ddbac47004e"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.905621 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" event={"ID":"aad21b5e-c192-46d8-9cbe-516f5dc5def2","Type":"ContainerStarted","Data":"707d5d0f01339c7a70cf1faa5a926979d02608cffcc452f19ef08f866cc5e069"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.905871 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.907960 4727 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-84klz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.908360 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" podUID="aad21b5e-c192-46d8-9cbe-516f5dc5def2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.910062 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" event={"ID":"784aadbb-0b96-4110-8ca3-7c38ca2456e4","Type":"ContainerStarted","Data":"266fdd195d604769c3bf0be2cc74d5e56e7b8218fdddb344907f515d3a8f1c75"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.910144 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" 
event={"ID":"784aadbb-0b96-4110-8ca3-7c38ca2456e4","Type":"ContainerStarted","Data":"39118e598a9e2a64554ef8d1ee241480d20d51737589894b408c1c48c27cf00e"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.913807 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6rldk" event={"ID":"f2187b19-c6d3-4d35-89e0-bf1124ab524f","Type":"ContainerStarted","Data":"0a98e248a519312588f8376437b04a870aafad54308a3f5d1f67e6915e9f25f0"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.929642 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xkkcc" event={"ID":"d9815009-494f-4e87-9d55-da93dc61b078","Type":"ContainerStarted","Data":"ee05110df687b3de7a89448765928515a45a4a4006c6517cfcc2ca98afab4e54"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.945936 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" event={"ID":"172bc9e3-a420-4a31-a309-98c533dfdb4f","Type":"ContainerStarted","Data":"84cbada3783ea15f3429231a8e50f1bed1133392323c3aaa0415ad44a69a3925"} Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.953109 4727 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sz95m container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.953168 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" podUID="ea389964-1da2-4ade-8772-b8bd1a76cc27" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.958823 4727 patch_prober.go:28] interesting pod/downloads-7954f5f757-qktm7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.958888 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qktm7" podUID="61ea519c-4d97-4e3e-b932-51a3f8e2e07f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.961068 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lkgbs" podStartSLOduration=124.961053167 podStartE2EDuration="2m4.961053167s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:29.960450846 +0000 UTC m=+148.281805693" watchObservedRunningTime="2025-10-01 12:39:29.961053167 +0000 UTC m=+148.282407994" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.966557 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.967707 4727 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nhtfk" Oct 01 12:39:29 crc kubenswrapper[4727]: I1001 12:39:29.972040 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:29 crc kubenswrapper[4727]: E1001 12:39:29.974204 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.474189784 +0000 UTC m=+148.795544621 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.075850 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:30 crc kubenswrapper[4727]: E1001 12:39:30.077125 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.577101656 +0000 UTC m=+148.898456493 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.106612 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" podStartSLOduration=125.10658407 podStartE2EDuration="2m5.10658407s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:30.100217193 +0000 UTC m=+148.421572040" watchObservedRunningTime="2025-10-01 12:39:30.10658407 +0000 UTC m=+148.427938907" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.157938 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mfjhb" podStartSLOduration=125.157912336 podStartE2EDuration="2m5.157912336s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:30.153581228 +0000 UTC m=+148.474936085" watchObservedRunningTime="2025-10-01 12:39:30.157912336 +0000 UTC m=+148.479267173" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.184812 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:30 crc kubenswrapper[4727]: E1001 12:39:30.185466 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.685448344 +0000 UTC m=+149.006803171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.230595 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cfmpf" podStartSLOduration=8.230564498 podStartE2EDuration="8.230564498s" podCreationTimestamp="2025-10-01 12:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:30.187581546 +0000 UTC m=+148.508936383" watchObservedRunningTime="2025-10-01 12:39:30.230564498 +0000 UTC m=+148.551919335" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.263501 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lpgxf" podStartSLOduration=125.263482439 podStartE2EDuration="2m5.263482439s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:30.233443156 +0000 UTC m=+148.554797993" watchObservedRunningTime="2025-10-01 12:39:30.263482439 +0000 UTC m=+148.584837276" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.287164 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:30 crc kubenswrapper[4727]: E1001 12:39:30.287752 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.787731574 +0000 UTC m=+149.109086411 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.316349 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" podStartSLOduration=125.316329338 podStartE2EDuration="2m5.316329338s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:30.265676373 +0000 UTC m=+148.587031200" watchObservedRunningTime="2025-10-01 12:39:30.316329338 +0000 UTC m=+148.637684175" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.316869 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" podStartSLOduration=125.316865066 podStartE2EDuration="2m5.316865066s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:30.316535445 +0000 UTC m=+148.637890292" watchObservedRunningTime="2025-10-01 12:39:30.316865066 +0000 UTC m=+148.638219903" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.382085 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-qhnkw" podStartSLOduration=125.382069975 podStartE2EDuration="2m5.382069975s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:30.381472935 +0000 UTC m=+148.702827772" watchObservedRunningTime="2025-10-01 12:39:30.382069975 +0000 UTC m=+148.703424812" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.389688 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:30 crc kubenswrapper[4727]: E1001 12:39:30.390091 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.890079098 +0000 UTC m=+149.211433935 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.458487 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-42q5h" podStartSLOduration=8.458465765 podStartE2EDuration="8.458465765s" podCreationTimestamp="2025-10-01 12:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:30.423780305 +0000 UTC m=+148.745135132" watchObservedRunningTime="2025-10-01 12:39:30.458465765 +0000 UTC m=+148.779820602" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.490795 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:30 crc kubenswrapper[4727]: E1001 12:39:30.491062 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.990984242 +0000 UTC m=+149.312339079 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.491329 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:30 crc kubenswrapper[4727]: E1001 12:39:30.491609 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:30.991597153 +0000 UTC m=+149.312951990 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.509723 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2447c" podStartSLOduration=125.509708289 podStartE2EDuration="2m5.509708289s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:30.459348845 +0000 UTC m=+148.780703682" watchObservedRunningTime="2025-10-01 12:39:30.509708289 +0000 UTC m=+148.831063126" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.593332 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:30 crc kubenswrapper[4727]: E1001 12:39:30.593737 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.093723578 +0000 UTC m=+149.415078415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.694456 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:30 crc kubenswrapper[4727]: E1001 12:39:30.694812 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.194798748 +0000 UTC m=+149.516153585 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.699505 4727 patch_prober.go:28] interesting pod/router-default-5444994796-8x58q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:39:30 crc kubenswrapper[4727]: [-]has-synced failed: reason withheld Oct 01 12:39:30 crc kubenswrapper[4727]: [+]process-running ok Oct 01 12:39:30 crc kubenswrapper[4727]: healthz check failed Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.699566 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8x58q" podUID="228aaa44-0de4-45e8-87d9-78ad4fa70f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.795227 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:30 crc kubenswrapper[4727]: E1001 12:39:30.795566 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.295551588 +0000 UTC m=+149.616906415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.896541 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:30 crc kubenswrapper[4727]: E1001 12:39:30.896867 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.396853345 +0000 UTC m=+149.718208182 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.954255 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" event={"ID":"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c","Type":"ContainerStarted","Data":"8f87b88c75a962364305763d00a796c0903dea21a00df8b5ccddbbbd823d1944"} Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.956446 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6rldk" event={"ID":"f2187b19-c6d3-4d35-89e0-bf1124ab524f","Type":"ContainerStarted","Data":"e51ae1a101e41c91a1135e924555e916b2c1279d8e11aa87106d807f7f32bbab"} Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.957249 4727 patch_prober.go:28] interesting pod/downloads-7954f5f757-qktm7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.957289 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qktm7" podUID="61ea519c-4d97-4e3e-b932-51a3f8e2e07f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.957635 4727 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sz95m container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.957700 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" podUID="ea389964-1da2-4ade-8772-b8bd1a76cc27" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.964498 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nz7c2" Oct 01 12:39:30 crc kubenswrapper[4727]: I1001 12:39:30.997728 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:30 crc kubenswrapper[4727]: E1001 12:39:30.998430 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 12:39:31.498407521 +0000 UTC m=+149.819762368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.101191 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:31 crc kubenswrapper[4727]: E1001 12:39:31.101668 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.601651715 +0000 UTC m=+149.923006542 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.130127 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6rldk" podStartSLOduration=126.130105113 podStartE2EDuration="2m6.130105113s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:31.035772603 +0000 UTC m=+149.357127450" watchObservedRunningTime="2025-10-01 12:39:31.130105113 +0000 UTC m=+149.451459970" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.211937 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:31 crc kubenswrapper[4727]: E1001 12:39:31.212675 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.712642602 +0000 UTC m=+150.033997449 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.315024 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.315086 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.315136 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.315168 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.315237 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:31 crc kubenswrapper[4727]: E1001 12:39:31.318435 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.818417552 +0000 UTC m=+150.139772389 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.323503 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.349079 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.350173 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.354809 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.398421 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.411449 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.416695 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:31 crc kubenswrapper[4727]: E1001 12:39:31.417268 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:31.917247195 +0000 UTC m=+150.238602032 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.520393 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:31 crc kubenswrapper[4727]: E1001 12:39:31.520860 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:32.020843691 +0000 UTC m=+150.342198528 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.612235 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.621420 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:31 crc kubenswrapper[4727]: E1001 12:39:31.621942 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:32.121917131 +0000 UTC m=+150.443271968 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.688208 4727 patch_prober.go:28] interesting pod/apiserver-76f77b778f-7c8v7 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 01 12:39:31 crc kubenswrapper[4727]: [+]log ok Oct 01 12:39:31 crc kubenswrapper[4727]: [+]etcd ok Oct 01 12:39:31 crc kubenswrapper[4727]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 01 12:39:31 crc kubenswrapper[4727]: [+]poststarthook/generic-apiserver-start-informers ok Oct 01 12:39:31 crc kubenswrapper[4727]: [+]poststarthook/max-in-flight-filter ok Oct 01 12:39:31 crc kubenswrapper[4727]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 01 12:39:31 crc kubenswrapper[4727]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 01 12:39:31 crc kubenswrapper[4727]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 01 12:39:31 crc kubenswrapper[4727]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 01 12:39:31 crc kubenswrapper[4727]: [+]poststarthook/project.openshift.io-projectcache ok Oct 01 12:39:31 crc kubenswrapper[4727]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 01 12:39:31 crc kubenswrapper[4727]: [+]poststarthook/openshift.io-startinformers ok Oct 01 12:39:31 crc kubenswrapper[4727]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 01 12:39:31 crc kubenswrapper[4727]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 01 12:39:31 crc kubenswrapper[4727]: livez check failed Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.688306 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" podUID="2ca098f7-bd42-4bfa-a5db-dcdb9d65e31d" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.706274 4727 patch_prober.go:28] interesting pod/router-default-5444994796-8x58q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:39:31 crc kubenswrapper[4727]: [-]has-synced failed: reason withheld Oct 01 12:39:31 crc kubenswrapper[4727]: [+]process-running ok Oct 01 12:39:31 crc kubenswrapper[4727]: healthz check failed Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.706364 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8x58q" podUID="228aaa44-0de4-45e8-87d9-78ad4fa70f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.713965 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.714966 4727 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.718426 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.718686 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.725707 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:31 crc kubenswrapper[4727]: E1001 12:39:31.726143 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:32.226127658 +0000 UTC m=+150.547482505 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.742193 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.828418 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.828633 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.828725 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:39:31 crc kubenswrapper[4727]: E1001 12:39:31.828816 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 12:39:32.328800833 +0000 UTC m=+150.650155670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.929947 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.930018 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.930085 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:31 crc kubenswrapper[4727]: E1001 12:39:31.930510 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:32.430493844 +0000 UTC m=+150.751848681 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.931055 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.961095 4727 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-84klz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.961169 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" podUID="aad21b5e-c192-46d8-9cbe-516f5dc5def2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 12:39:31 crc kubenswrapper[4727]: I1001 12:39:31.968906 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:31.998497 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" event={"ID":"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c","Type":"ContainerStarted","Data":"a4a5d4836c889411875fcc4d2448140c5ed68a6d049dbdbd1a5d434de95b6bfe"} Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.032917 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:32 crc kubenswrapper[4727]: E1001 12:39:32.033558 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:32.53354338 +0000 UTC m=+150.854898217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.055380 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.140716 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:32 crc kubenswrapper[4727]: E1001 12:39:32.143640 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:32.643625797 +0000 UTC m=+150.964980644 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.245732 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:32 crc kubenswrapper[4727]: E1001 12:39:32.245827 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:32.745803235 +0000 UTC m=+151.067158072 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.245957 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:32 crc kubenswrapper[4727]: E1001 12:39:32.246259 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:32.746245699 +0000 UTC m=+151.067600536 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:32 crc kubenswrapper[4727]: W1001 12:39:32.333552 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-a1b231a4f1801b4b3cd531f95cd5140bd9184a7d3581fd5d32b0137d02c62d73 WatchSource:0}: Error finding container a1b231a4f1801b4b3cd531f95cd5140bd9184a7d3581fd5d32b0137d02c62d73: Status 404 returned error can't find the container with id a1b231a4f1801b4b3cd531f95cd5140bd9184a7d3581fd5d32b0137d02c62d73 Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.349429 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:32 crc kubenswrapper[4727]: E1001 12:39:32.350133 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:32.850114504 +0000 UTC m=+151.171469341 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.452856 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:32 crc kubenswrapper[4727]: E1001 12:39:32.453188 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:32.953177502 +0000 UTC m=+151.274532339 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.554827 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:32 crc kubenswrapper[4727]: E1001 12:39:32.555526 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:33.055493925 +0000 UTC m=+151.376848762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.657481 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:32 crc kubenswrapper[4727]: E1001 12:39:32.658131 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:33.158108297 +0000 UTC m=+151.479463134 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.661127 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.705802 4727 patch_prober.go:28] interesting pod/router-default-5444994796-8x58q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:39:32 crc kubenswrapper[4727]: [-]has-synced failed: reason withheld Oct 01 12:39:32 crc kubenswrapper[4727]: [+]process-running ok Oct 01 12:39:32 crc kubenswrapper[4727]: healthz check failed Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.705889 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8x58q" podUID="228aaa44-0de4-45e8-87d9-78ad4fa70f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.720661 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8l85n"] Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.721975 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.725787 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.749509 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8l85n"] Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.759414 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:32 crc kubenswrapper[4727]: E1001 12:39:32.759993 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:33.259967543 +0000 UTC m=+151.581322380 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.860689 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8548d350-ee32-44e6-85d2-2e30036d5eb8-utilities\") pod \"community-operators-8l85n\" (UID: \"8548d350-ee32-44e6-85d2-2e30036d5eb8\") " pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.860742 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4l2d\" (UniqueName: \"kubernetes.io/projected/8548d350-ee32-44e6-85d2-2e30036d5eb8-kube-api-access-g4l2d\") pod \"community-operators-8l85n\" (UID: \"8548d350-ee32-44e6-85d2-2e30036d5eb8\") " pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.860777 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.860815 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8548d350-ee32-44e6-85d2-2e30036d5eb8-catalog-content\") pod \"community-operators-8l85n\" (UID: \"8548d350-ee32-44e6-85d2-2e30036d5eb8\") " pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:39:32 crc kubenswrapper[4727]: E1001 12:39:32.861129 4727 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:33.361116236 +0000 UTC m=+151.682471073 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.907896 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hj5p4"] Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.908845 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.916451 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.938035 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hj5p4"] Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.962228 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.962624 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8548d350-ee32-44e6-85d2-2e30036d5eb8-catalog-content\") pod \"community-operators-8l85n\" (UID: \"8548d350-ee32-44e6-85d2-2e30036d5eb8\") " pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.962685 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8548d350-ee32-44e6-85d2-2e30036d5eb8-utilities\") pod \"community-operators-8l85n\" (UID: \"8548d350-ee32-44e6-85d2-2e30036d5eb8\") " pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.962714 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4l2d\" (UniqueName: \"kubernetes.io/projected/8548d350-ee32-44e6-85d2-2e30036d5eb8-kube-api-access-g4l2d\") pod \"community-operators-8l85n\" (UID: \"8548d350-ee32-44e6-85d2-2e30036d5eb8\") " pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:39:32 crc kubenswrapper[4727]: E1001 12:39:32.963111 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:33.463070335 +0000 UTC m=+151.784425172 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.963546 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8548d350-ee32-44e6-85d2-2e30036d5eb8-catalog-content\") pod \"community-operators-8l85n\" (UID: \"8548d350-ee32-44e6-85d2-2e30036d5eb8\") " pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.963766 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8548d350-ee32-44e6-85d2-2e30036d5eb8-utilities\") pod \"community-operators-8l85n\" (UID: \"8548d350-ee32-44e6-85d2-2e30036d5eb8\") " pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:39:32 crc kubenswrapper[4727]: I1001 12:39:32.996644 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4l2d\" (UniqueName: \"kubernetes.io/projected/8548d350-ee32-44e6-85d2-2e30036d5eb8-kube-api-access-g4l2d\") pod \"community-operators-8l85n\" (UID: \"8548d350-ee32-44e6-85d2-2e30036d5eb8\") " pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.041720 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"096e97c5f6ee376b860dbb0ee080547b3d466c5599e2cd4280d202e94ed35e60"} Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.041779 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"379b10738c7fe066ad511d85997241a75541e99df5203dd6dc8082af1f69e47c"} Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.059957 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" event={"ID":"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c","Type":"ContainerStarted","Data":"ed1687d394355cc353bca5af7c051b43dc0e687ed9823b80a858d476b2882f0e"} Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.065717 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a39814-1723-46e2-b468-67e6cf668788-catalog-content\") pod \"certified-operators-hj5p4\" (UID: \"c9a39814-1723-46e2-b468-67e6cf668788\") " pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.065814 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 
12:39:33.065858 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4h42\" (UniqueName: \"kubernetes.io/projected/c9a39814-1723-46e2-b468-67e6cf668788-kube-api-access-r4h42\") pod \"certified-operators-hj5p4\" (UID: \"c9a39814-1723-46e2-b468-67e6cf668788\") " pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.065898 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a39814-1723-46e2-b468-67e6cf668788-utilities\") pod \"certified-operators-hj5p4\" (UID: \"c9a39814-1723-46e2-b468-67e6cf668788\") " pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:39:33 crc kubenswrapper[4727]: E1001 12:39:33.066282 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:33.566265158 +0000 UTC m=+151.887620005 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.073107 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c5a4b2408d159b64c18babd08e702b3f8b042e4ea7cb29256149d5883a74a25d"} Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.073337 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.074737 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8","Type":"ContainerStarted","Data":"0ba1b5170eda5a50c8e406e020dd250c4817bd8c6159d13b5736f9cf4df8f0ea"} Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.078497 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a1b231a4f1801b4b3cd531f95cd5140bd9184a7d3581fd5d32b0137d02c62d73"} Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.098159 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lrd89"] Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.099839 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.112541 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrd89"] Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.127893 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.167235 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.167437 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a39814-1723-46e2-b468-67e6cf668788-catalog-content\") pod \"certified-operators-hj5p4\" (UID: \"c9a39814-1723-46e2-b468-67e6cf668788\") " pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.167515 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4h42\" (UniqueName: \"kubernetes.io/projected/c9a39814-1723-46e2-b468-67e6cf668788-kube-api-access-r4h42\") pod \"certified-operators-hj5p4\" (UID: \"c9a39814-1723-46e2-b468-67e6cf668788\") " pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.167551 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a39814-1723-46e2-b468-67e6cf668788-utilities\") pod \"certified-operators-hj5p4\" (UID: \"c9a39814-1723-46e2-b468-67e6cf668788\") " pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.167932 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a39814-1723-46e2-b468-67e6cf668788-utilities\") pod \"certified-operators-hj5p4\" (UID: \"c9a39814-1723-46e2-b468-67e6cf668788\") " pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:39:33 crc kubenswrapper[4727]: E1001 12:39:33.168029 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:33.66801399 +0000 UTC m=+151.989368817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.168226 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a39814-1723-46e2-b468-67e6cf668788-catalog-content\") pod \"certified-operators-hj5p4\" (UID: \"c9a39814-1723-46e2-b468-67e6cf668788\") " pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.177367 4727 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.191302 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4h42\" (UniqueName: \"kubernetes.io/projected/c9a39814-1723-46e2-b468-67e6cf668788-kube-api-access-r4h42\") pod \"certified-operators-hj5p4\" (UID: \"c9a39814-1723-46e2-b468-67e6cf668788\") " pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.241325 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:39:33 crc kubenswrapper[4727]: E1001 12:39:33.276362 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:33.776345607 +0000 UTC m=+152.097700444 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.276534 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.276574 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkk2g\" (UniqueName: \"kubernetes.io/projected/c3beff22-e67e-4639-9562-3663809167d7-kube-api-access-fkk2g\") pod \"community-operators-lrd89\" (UID: \"c3beff22-e67e-4639-9562-3663809167d7\") " pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.276594 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3beff22-e67e-4639-9562-3663809167d7-utilities\") pod \"community-operators-lrd89\" (UID: \"c3beff22-e67e-4639-9562-3663809167d7\") " pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.276623 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3beff22-e67e-4639-9562-3663809167d7-catalog-content\") pod \"community-operators-lrd89\" (UID: \"c3beff22-e67e-4639-9562-3663809167d7\") " pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.291818 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.291891 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.306199 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kfxlj"] Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.309557 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.337042 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfxlj"] Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.378901 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:33 crc kubenswrapper[4727]: E1001 12:39:33.379521 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:33.879468717 +0000 UTC m=+152.200823554 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.379890 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.379944 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkk2g\" (UniqueName: \"kubernetes.io/projected/c3beff22-e67e-4639-9562-3663809167d7-kube-api-access-fkk2g\") pod \"community-operators-lrd89\" (UID: \"c3beff22-e67e-4639-9562-3663809167d7\") " pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.379971 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3beff22-e67e-4639-9562-3663809167d7-utilities\") pod \"community-operators-lrd89\" (UID: \"c3beff22-e67e-4639-9562-3663809167d7\") " pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.380042 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3beff22-e67e-4639-9562-3663809167d7-catalog-content\") pod \"community-operators-lrd89\" (UID: \"c3beff22-e67e-4639-9562-3663809167d7\") " pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:39:33 crc kubenswrapper[4727]: E1001 12:39:33.387686 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:33.887663646 +0000 UTC m=+152.209018483 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.388871 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3beff22-e67e-4639-9562-3663809167d7-utilities\") pod \"community-operators-lrd89\" (UID: \"c3beff22-e67e-4639-9562-3663809167d7\") " pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.389101 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3beff22-e67e-4639-9562-3663809167d7-catalog-content\") pod \"community-operators-lrd89\" (UID: \"c3beff22-e67e-4639-9562-3663809167d7\") " pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.475638 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkk2g\" (UniqueName: \"kubernetes.io/projected/c3beff22-e67e-4639-9562-3663809167d7-kube-api-access-fkk2g\") pod \"community-operators-lrd89\" (UID: \"c3beff22-e67e-4639-9562-3663809167d7\") " pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.481019 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:33 crc kubenswrapper[4727]: E1001 12:39:33.481192 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 12:39:33.981159958 +0000 UTC m=+152.302514795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.481245 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr9tw\" (UniqueName: \"kubernetes.io/projected/e4d11880-82d5-49b0-965a-e2fc54b9c775-kube-api-access-qr9tw\") pod \"certified-operators-kfxlj\" (UID: \"e4d11880-82d5-49b0-965a-e2fc54b9c775\") " pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.481274 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4d11880-82d5-49b0-965a-e2fc54b9c775-utilities\") pod \"certified-operators-kfxlj\" (UID: \"e4d11880-82d5-49b0-965a-e2fc54b9c775\") " pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.481299 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.481319 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4d11880-82d5-49b0-965a-e2fc54b9c775-catalog-content\") pod \"certified-operators-kfxlj\" (UID: \"e4d11880-82d5-49b0-965a-e2fc54b9c775\") " pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:39:33 crc kubenswrapper[4727]: E1001 12:39:33.481743 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 12:39:33.981731747 +0000 UTC m=+152.303086584 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8lp9x" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.562943 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8l85n"] Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.569914 4727 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-01T12:39:33.17740053Z","Handler":null,"Name":""} Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.574426 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hj5p4"] Oct 01 12:39:33 crc kubenswrapper[4727]: W1001 12:39:33.579037 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8548d350_ee32_44e6_85d2_2e30036d5eb8.slice/crio-8b38457f456bd25dc8a3c39e393d06606a6233a1ba675701a88c901d14d413da WatchSource:0}: Error finding container 8b38457f456bd25dc8a3c39e393d06606a6233a1ba675701a88c901d14d413da: Status 404 returned error can't find the container with id 8b38457f456bd25dc8a3c39e393d06606a6233a1ba675701a88c901d14d413da Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.582360 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.582540 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4d11880-82d5-49b0-965a-e2fc54b9c775-utilities\") pod \"certified-operators-kfxlj\" (UID: \"e4d11880-82d5-49b0-965a-e2fc54b9c775\") " pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.582597 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4d11880-82d5-49b0-965a-e2fc54b9c775-catalog-content\") pod \"certified-operators-kfxlj\" (UID: \"e4d11880-82d5-49b0-965a-e2fc54b9c775\") " pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.582679 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr9tw\" (UniqueName: \"kubernetes.io/projected/e4d11880-82d5-49b0-965a-e2fc54b9c775-kube-api-access-qr9tw\") pod \"certified-operators-kfxlj\" (UID: \"e4d11880-82d5-49b0-965a-e2fc54b9c775\") " pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.582858 4727 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.582913 4727 
csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.583497 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4d11880-82d5-49b0-965a-e2fc54b9c775-utilities\") pod \"certified-operators-kfxlj\" (UID: \"e4d11880-82d5-49b0-965a-e2fc54b9c775\") " pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.583730 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4d11880-82d5-49b0-965a-e2fc54b9c775-catalog-content\") pod \"certified-operators-kfxlj\" (UID: \"e4d11880-82d5-49b0-965a-e2fc54b9c775\") " pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.589073 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.602361 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr9tw\" (UniqueName: \"kubernetes.io/projected/e4d11880-82d5-49b0-965a-e2fc54b9c775-kube-api-access-qr9tw\") pod \"certified-operators-kfxlj\" (UID: \"e4d11880-82d5-49b0-965a-e2fc54b9c775\") " pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.684364 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.688247 4727 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.688588 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.701319 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.705783 4727 patch_prober.go:28] interesting pod/router-default-5444994796-8x58q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:39:33 crc kubenswrapper[4727]: [-]has-synced failed: reason withheld Oct 01 12:39:33 crc kubenswrapper[4727]: [+]process-running ok Oct 01 12:39:33 crc kubenswrapper[4727]: healthz check failed Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.705858 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8x58q" podUID="228aaa44-0de4-45e8-87d9-78ad4fa70f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.721701 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8lp9x\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.722142 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.880732 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d4gnf" Oct 01 12:39:33 crc kubenswrapper[4727]: I1001 12:39:33.917741 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.049361 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.057150 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7c8v7" Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.093333 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"30248830c6db67eea37ebb9f2aac2ca243fe49167ae2c7db1fa263f96e15d20f"} Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.096391 4727 generic.go:334] "Generic (PLEG): container finished" podID="8548d350-ee32-44e6-85d2-2e30036d5eb8" containerID="afc8ccf487c14a4f9edf55263037becb617cf50031103899fb6f4e760874f06a" exitCode=0 Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.096463 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l85n" event={"ID":"8548d350-ee32-44e6-85d2-2e30036d5eb8","Type":"ContainerDied","Data":"afc8ccf487c14a4f9edf55263037becb617cf50031103899fb6f4e760874f06a"} Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.096492 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l85n" event={"ID":"8548d350-ee32-44e6-85d2-2e30036d5eb8","Type":"ContainerStarted","Data":"8b38457f456bd25dc8a3c39e393d06606a6233a1ba675701a88c901d14d413da"} Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.098111 4727 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.103474 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" event={"ID":"59f2fda4-ad42-4bd7-89c8-9f1a8c54123c","Type":"ContainerStarted","Data":"4d87818510e798898717749fc0275d6cbecf7a408c6124fe4ba6177f12c18671"} Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.121613 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1a7b3f69f1dffa818d5bbb0a014cfd3858c5dbd2cf6f41b139e621a6195c0bcc"} Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.143814 4727 generic.go:334] "Generic (PLEG): container finished" podID="86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8" containerID="0ef35fa2aea7c8527f900c28cebd09479f48e0e8a89894721ff678c898d15462" exitCode=0 Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.143906 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8","Type":"ContainerDied","Data":"0ef35fa2aea7c8527f900c28cebd09479f48e0e8a89894721ff678c898d15462"} Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.171592 4727 generic.go:334] "Generic (PLEG): container finished" podID="c9a39814-1723-46e2-b468-67e6cf668788" containerID="0dec39fec16013a8387e65a723203929d58d7ea1fcf8e53c988ad0b25b23b35b" exitCode=0 Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.171760 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj5p4" 
event={"ID":"c9a39814-1723-46e2-b468-67e6cf668788","Type":"ContainerDied","Data":"0dec39fec16013a8387e65a723203929d58d7ea1fcf8e53c988ad0b25b23b35b"} Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.171795 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj5p4" event={"ID":"c9a39814-1723-46e2-b468-67e6cf668788","Type":"ContainerStarted","Data":"ada921ffbf3889ad27d9dd2e578a5944c8216ef297af7cbfcaad23a18979837b"} Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.242059 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tcfrz" podStartSLOduration=12.242020363 podStartE2EDuration="12.242020363s" podCreationTimestamp="2025-10-01 12:39:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:34.239612141 +0000 UTC m=+152.560966988" watchObservedRunningTime="2025-10-01 12:39:34.242020363 +0000 UTC m=+152.563375200" Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.297457 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrd89"] Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.310795 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfxlj"] Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.365882 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8lp9x"] Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.431325 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.698661 4727 patch_prober.go:28] interesting pod/router-default-5444994796-8x58q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:39:34 crc kubenswrapper[4727]: [-]has-synced failed: reason withheld Oct 01 12:39:34 crc kubenswrapper[4727]: [+]process-running ok Oct 01 12:39:34 crc kubenswrapper[4727]: healthz check failed Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.698764 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8x58q" podUID="228aaa44-0de4-45e8-87d9-78ad4fa70f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.883751 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.883822 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.886438 4727 patch_prober.go:28] interesting pod/console-f9d7485db-89tz6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.886543 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-89tz6" 
podUID="366b7e92-ea45-4052-8ddc-9540d534a7ad" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.893135 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kws6b"] Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.894603 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.896706 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 12:39:34 crc kubenswrapper[4727]: I1001 12:39:34.906753 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kws6b"] Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.010154 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defd3d6f-dd53-4725-af25-c711790c4870-utilities\") pod \"redhat-marketplace-kws6b\" (UID: \"defd3d6f-dd53-4725-af25-c711790c4870\") " pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.010210 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defd3d6f-dd53-4725-af25-c711790c4870-catalog-content\") pod \"redhat-marketplace-kws6b\" (UID: \"defd3d6f-dd53-4725-af25-c711790c4870\") " pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.010325 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnnmb\" (UniqueName: \"kubernetes.io/projected/defd3d6f-dd53-4725-af25-c711790c4870-kube-api-access-xnnmb\") pod \"redhat-marketplace-kws6b\" (UID: \"defd3d6f-dd53-4725-af25-c711790c4870\") " pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.111698 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defd3d6f-dd53-4725-af25-c711790c4870-utilities\") pod \"redhat-marketplace-kws6b\" (UID: \"defd3d6f-dd53-4725-af25-c711790c4870\") " pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.112336 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defd3d6f-dd53-4725-af25-c711790c4870-catalog-content\") pod \"redhat-marketplace-kws6b\" (UID: \"defd3d6f-dd53-4725-af25-c711790c4870\") " pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.112261 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defd3d6f-dd53-4725-af25-c711790c4870-utilities\") pod \"redhat-marketplace-kws6b\" (UID: \"defd3d6f-dd53-4725-af25-c711790c4870\") " pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.112838 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/defd3d6f-dd53-4725-af25-c711790c4870-catalog-content\") pod \"redhat-marketplace-kws6b\" (UID: \"defd3d6f-dd53-4725-af25-c711790c4870\") " pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.112909 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnnmb\" (UniqueName: \"kubernetes.io/projected/defd3d6f-dd53-4725-af25-c711790c4870-kube-api-access-xnnmb\") pod \"redhat-marketplace-kws6b\" (UID: \"defd3d6f-dd53-4725-af25-c711790c4870\") " pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.138470 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnnmb\" (UniqueName: \"kubernetes.io/projected/defd3d6f-dd53-4725-af25-c711790c4870-kube-api-access-xnnmb\") pod \"redhat-marketplace-kws6b\" (UID: \"defd3d6f-dd53-4725-af25-c711790c4870\") " pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.179417 4727 generic.go:334] "Generic (PLEG): container finished" podID="c3beff22-e67e-4639-9562-3663809167d7" containerID="2013368ee9db53dd6a3f4b96626c4f5e57218941c5d10fbf3dc1c4ff55852f98" exitCode=0 Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.179485 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrd89" event={"ID":"c3beff22-e67e-4639-9562-3663809167d7","Type":"ContainerDied","Data":"2013368ee9db53dd6a3f4b96626c4f5e57218941c5d10fbf3dc1c4ff55852f98"} Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.179557 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrd89" event={"ID":"c3beff22-e67e-4639-9562-3663809167d7","Type":"ContainerStarted","Data":"46cc483d66fd1c6f924b08a4e7f44f6697541002f26871353acc0f96df15c4f4"} Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.181290 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" event={"ID":"73638d71-c9ed-4ad0-866d-67c36b52de3e","Type":"ContainerStarted","Data":"c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c"} Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.181325 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" event={"ID":"73638d71-c9ed-4ad0-866d-67c36b52de3e","Type":"ContainerStarted","Data":"136dc78359478b3de167a0d296a7cc0d7f1725f7ebe4995b9e462639813d9f84"} Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.181460 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.182864 4727 generic.go:334] "Generic (PLEG): container finished" podID="e4d11880-82d5-49b0-965a-e2fc54b9c775" containerID="7589925fab3198d500a3ff17eb74351ef1044950b8c9039669e95794b3039a01" exitCode=0 Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.182914 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxlj" event={"ID":"e4d11880-82d5-49b0-965a-e2fc54b9c775","Type":"ContainerDied","Data":"7589925fab3198d500a3ff17eb74351ef1044950b8c9039669e95794b3039a01"} Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.182943 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxlj" 
event={"ID":"e4d11880-82d5-49b0-965a-e2fc54b9c775","Type":"ContainerStarted","Data":"1859e903cb4592d1825d9f77e936bfc24f58e0e2ffcc91f5dd0ffbaee3dbb12b"} Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.210447 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.232424 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.233362 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.236657 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.237172 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.262335 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" podStartSLOduration=130.262317156 podStartE2EDuration="2m10.262317156s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:35.254366556 +0000 UTC m=+153.575721393" watchObservedRunningTime="2025-10-01 12:39:35.262317156 +0000 UTC m=+153.583671993" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.263166 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.299947 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2mhjs"] Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.304833 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.316761 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ce32e99-94ad-40ce-988d-febe69875dac-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3ce32e99-94ad-40ce-988d-febe69875dac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.316824 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ce32e99-94ad-40ce-988d-febe69875dac-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3ce32e99-94ad-40ce-988d-febe69875dac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.355507 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2mhjs"] Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.420734 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqxw5\" (UniqueName: \"kubernetes.io/projected/051e51bb-3387-4009-8c88-fd90d76af6e2-kube-api-access-jqxw5\") pod \"redhat-marketplace-2mhjs\" (UID: \"051e51bb-3387-4009-8c88-fd90d76af6e2\") " pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.420790 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/051e51bb-3387-4009-8c88-fd90d76af6e2-catalog-content\") pod \"redhat-marketplace-2mhjs\" (UID: \"051e51bb-3387-4009-8c88-fd90d76af6e2\") " pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.420815 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/051e51bb-3387-4009-8c88-fd90d76af6e2-utilities\") pod \"redhat-marketplace-2mhjs\" (UID: \"051e51bb-3387-4009-8c88-fd90d76af6e2\") " pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.420854 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ce32e99-94ad-40ce-988d-febe69875dac-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3ce32e99-94ad-40ce-988d-febe69875dac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.420880 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ce32e99-94ad-40ce-988d-febe69875dac-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3ce32e99-94ad-40ce-988d-febe69875dac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.421684 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ce32e99-94ad-40ce-988d-febe69875dac-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3ce32e99-94ad-40ce-988d-febe69875dac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.471492 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ce32e99-94ad-40ce-988d-febe69875dac-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3ce32e99-94ad-40ce-988d-febe69875dac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.510773 4727 patch_prober.go:28] interesting pod/downloads-7954f5f757-qktm7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.511375 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qktm7" podUID="61ea519c-4d97-4e3e-b932-51a3f8e2e07f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.512340 4727 patch_prober.go:28] interesting pod/downloads-7954f5f757-qktm7 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.512366 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qktm7" podUID="61ea519c-4d97-4e3e-b932-51a3f8e2e07f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.523713 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/051e51bb-3387-4009-8c88-fd90d76af6e2-catalog-content\") pod \"redhat-marketplace-2mhjs\" (UID: \"051e51bb-3387-4009-8c88-fd90d76af6e2\") " pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.523777 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/051e51bb-3387-4009-8c88-fd90d76af6e2-utilities\") pod \"redhat-marketplace-2mhjs\" (UID: \"051e51bb-3387-4009-8c88-fd90d76af6e2\") " pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.523894 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqxw5\" (UniqueName: \"kubernetes.io/projected/051e51bb-3387-4009-8c88-fd90d76af6e2-kube-api-access-jqxw5\") pod \"redhat-marketplace-2mhjs\" (UID: \"051e51bb-3387-4009-8c88-fd90d76af6e2\") " pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.524605 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/051e51bb-3387-4009-8c88-fd90d76af6e2-utilities\") pod \"redhat-marketplace-2mhjs\" (UID: \"051e51bb-3387-4009-8c88-fd90d76af6e2\") " pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.524942 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/051e51bb-3387-4009-8c88-fd90d76af6e2-catalog-content\") pod \"redhat-marketplace-2mhjs\" (UID: 
\"051e51bb-3387-4009-8c88-fd90d76af6e2\") " pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.547775 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqxw5\" (UniqueName: \"kubernetes.io/projected/051e51bb-3387-4009-8c88-fd90d76af6e2-kube-api-access-jqxw5\") pod \"redhat-marketplace-2mhjs\" (UID: \"051e51bb-3387-4009-8c88-fd90d76af6e2\") " pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.556269 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.560565 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.580805 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.616641 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kws6b"] Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.625121 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8-kubelet-dir\") pod \"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8\" (UID: \"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8\") " Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.625220 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8" (UID: "86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.625287 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8-kube-api-access\") pod \"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8\" (UID: \"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8\") " Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.626280 4727 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.629209 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8" (UID: "86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.646049 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.696232 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.700879 4727 patch_prober.go:28] interesting pod/router-default-5444994796-8x58q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:39:35 crc kubenswrapper[4727]: [-]has-synced failed: reason withheld Oct 01 12:39:35 crc kubenswrapper[4727]: [+]process-running ok Oct 01 12:39:35 crc kubenswrapper[4727]: healthz check failed Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.700934 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8x58q" podUID="228aaa44-0de4-45e8-87d9-78ad4fa70f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.727966 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.898724 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nc8w8"] Oct 01 12:39:35 crc kubenswrapper[4727]: E1001 12:39:35.899357 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8" containerName="pruner" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.899376 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8" containerName="pruner" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.899488 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8" containerName="pruner" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.900435 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.903212 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.936272 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc8w8"] Oct 01 12:39:35 crc kubenswrapper[4727]: I1001 12:39:35.940576 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-84klz" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.055008 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1b0fc07-8033-4220-a491-cc668e795d10-catalog-content\") pod \"redhat-operators-nc8w8\" (UID: \"d1b0fc07-8033-4220-a491-cc668e795d10\") " pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.055107 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvr5b\" (UniqueName: \"kubernetes.io/projected/d1b0fc07-8033-4220-a491-cc668e795d10-kube-api-access-rvr5b\") pod \"redhat-operators-nc8w8\" (UID: \"d1b0fc07-8033-4220-a491-cc668e795d10\") " pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.055162 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1b0fc07-8033-4220-a491-cc668e795d10-utilities\") pod \"redhat-operators-nc8w8\" (UID: \"d1b0fc07-8033-4220-a491-cc668e795d10\") " pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.077196 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.156780 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1b0fc07-8033-4220-a491-cc668e795d10-catalog-content\") pod \"redhat-operators-nc8w8\" (UID: \"d1b0fc07-8033-4220-a491-cc668e795d10\") " pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.156853 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvr5b\" (UniqueName: \"kubernetes.io/projected/d1b0fc07-8033-4220-a491-cc668e795d10-kube-api-access-rvr5b\") pod \"redhat-operators-nc8w8\" (UID: \"d1b0fc07-8033-4220-a491-cc668e795d10\") " pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.156896 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1b0fc07-8033-4220-a491-cc668e795d10-utilities\") pod \"redhat-operators-nc8w8\" (UID: \"d1b0fc07-8033-4220-a491-cc668e795d10\") " pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.157384 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1b0fc07-8033-4220-a491-cc668e795d10-catalog-content\") pod \"redhat-operators-nc8w8\" (UID: \"d1b0fc07-8033-4220-a491-cc668e795d10\") " 
pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.157529 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1b0fc07-8033-4220-a491-cc668e795d10-utilities\") pod \"redhat-operators-nc8w8\" (UID: \"d1b0fc07-8033-4220-a491-cc668e795d10\") " pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.184787 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvr5b\" (UniqueName: \"kubernetes.io/projected/d1b0fc07-8033-4220-a491-cc668e795d10-kube-api-access-rvr5b\") pod \"redhat-operators-nc8w8\" (UID: \"d1b0fc07-8033-4220-a491-cc668e795d10\") " pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.215376 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2mhjs"] Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.221392 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"86a54538-eb6c-4faa-8fb2-0b6f2e0c7fd8","Type":"ContainerDied","Data":"0ba1b5170eda5a50c8e406e020dd250c4817bd8c6159d13b5736f9cf4df8f0ea"} Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.221430 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ba1b5170eda5a50c8e406e020dd250c4817bd8c6159d13b5736f9cf4df8f0ea" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.221495 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.231781 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3ce32e99-94ad-40ce-988d-febe69875dac","Type":"ContainerStarted","Data":"dcb2ff196639962863a7f38372324cd9aeaf4fa17ec81c0893585e104c7e28ed"} Oct 01 12:39:36 crc kubenswrapper[4727]: W1001 12:39:36.244209 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod051e51bb_3387_4009_8c88_fd90d76af6e2.slice/crio-7e52048bf40d26be5c996f699f1b883526d7740622d4513c670c5dd0532b0bb0 WatchSource:0}: Error finding container 7e52048bf40d26be5c996f699f1b883526d7740622d4513c670c5dd0532b0bb0: Status 404 returned error can't find the container with id 7e52048bf40d26be5c996f699f1b883526d7740622d4513c670c5dd0532b0bb0 Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.254311 4727 generic.go:334] "Generic (PLEG): container finished" podID="defd3d6f-dd53-4725-af25-c711790c4870" containerID="4c7269cb9d1c4764fb2d999e192b9cf5c53aa5b732d2b6149e67229b8aea5ed4" exitCode=0 Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.255096 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kws6b" event={"ID":"defd3d6f-dd53-4725-af25-c711790c4870","Type":"ContainerDied","Data":"4c7269cb9d1c4764fb2d999e192b9cf5c53aa5b732d2b6149e67229b8aea5ed4"} Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.255206 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kws6b" event={"ID":"defd3d6f-dd53-4725-af25-c711790c4870","Type":"ContainerStarted","Data":"f179553b92247d17a97d63db8e2672690e4634405ea59032af36e7b5575a061c"} Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 
12:39:36.287954 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.310082 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bkz5m"] Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.311439 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.314019 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bkz5m"] Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.468272 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9005845-d890-439c-9b2d-2a80c0f61697-utilities\") pod \"redhat-operators-bkz5m\" (UID: \"b9005845-d890-439c-9b2d-2a80c0f61697\") " pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.468347 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz2wx\" (UniqueName: \"kubernetes.io/projected/b9005845-d890-439c-9b2d-2a80c0f61697-kube-api-access-jz2wx\") pod \"redhat-operators-bkz5m\" (UID: \"b9005845-d890-439c-9b2d-2a80c0f61697\") " pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.468419 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9005845-d890-439c-9b2d-2a80c0f61697-catalog-content\") pod \"redhat-operators-bkz5m\" (UID: \"b9005845-d890-439c-9b2d-2a80c0f61697\") " pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.570505 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9005845-d890-439c-9b2d-2a80c0f61697-utilities\") pod \"redhat-operators-bkz5m\" (UID: \"b9005845-d890-439c-9b2d-2a80c0f61697\") " pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.571208 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz2wx\" (UniqueName: \"kubernetes.io/projected/b9005845-d890-439c-9b2d-2a80c0f61697-kube-api-access-jz2wx\") pod \"redhat-operators-bkz5m\" (UID: \"b9005845-d890-439c-9b2d-2a80c0f61697\") " pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.571270 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9005845-d890-439c-9b2d-2a80c0f61697-catalog-content\") pod \"redhat-operators-bkz5m\" (UID: \"b9005845-d890-439c-9b2d-2a80c0f61697\") " pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.571913 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9005845-d890-439c-9b2d-2a80c0f61697-catalog-content\") pod \"redhat-operators-bkz5m\" (UID: \"b9005845-d890-439c-9b2d-2a80c0f61697\") " pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.572363 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9005845-d890-439c-9b2d-2a80c0f61697-utilities\") pod \"redhat-operators-bkz5m\" (UID: \"b9005845-d890-439c-9b2d-2a80c0f61697\") " pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.606220 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz2wx\" (UniqueName: \"kubernetes.io/projected/b9005845-d890-439c-9b2d-2a80c0f61697-kube-api-access-jz2wx\") pod \"redhat-operators-bkz5m\" (UID: \"b9005845-d890-439c-9b2d-2a80c0f61697\") " pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.720912 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc8w8"] Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.724707 4727 patch_prober.go:28] interesting pod/router-default-5444994796-8x58q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:39:36 crc kubenswrapper[4727]: [-]has-synced failed: reason withheld Oct 01 12:39:36 crc kubenswrapper[4727]: [+]process-running ok Oct 01 12:39:36 crc kubenswrapper[4727]: healthz check failed Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.724857 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8x58q" podUID="228aaa44-0de4-45e8-87d9-78ad4fa70f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:39:36 crc kubenswrapper[4727]: I1001 12:39:36.791393 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:39:37 crc kubenswrapper[4727]: I1001 12:39:37.281303 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3ce32e99-94ad-40ce-988d-febe69875dac","Type":"ContainerStarted","Data":"a3f7fddbadccc62936b5b2bf6e58ad52e36a7b4ae312945e5db9f2a713989060"} Oct 01 12:39:37 crc kubenswrapper[4727]: I1001 12:39:37.332088 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8w8" event={"ID":"d1b0fc07-8033-4220-a491-cc668e795d10","Type":"ContainerStarted","Data":"3ab9b4d3f81f00309d02784c98d65551bcd1344e6137a042bb8f0cd1159a6054"} Oct 01 12:39:37 crc kubenswrapper[4727]: I1001 12:39:37.332189 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8w8" event={"ID":"d1b0fc07-8033-4220-a491-cc668e795d10","Type":"ContainerStarted","Data":"c104fdd951a10df6504eeae8ca9d6fa0552f05afe1e9a6b646e2ad2eaa0a43c6"} Oct 01 12:39:37 crc kubenswrapper[4727]: I1001 12:39:37.362920 4727 generic.go:334] "Generic (PLEG): container finished" podID="1974282f-c2f4-48cd-97e2-9e880203ef1c" containerID="d9fb1e4352d7559f140d9f5fdb64bd79ffc4b43f2e90b5d94ac7269b7e83d9b5" exitCode=0 Oct 01 12:39:37 crc kubenswrapper[4727]: I1001 12:39:37.363030 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" event={"ID":"1974282f-c2f4-48cd-97e2-9e880203ef1c","Type":"ContainerDied","Data":"d9fb1e4352d7559f140d9f5fdb64bd79ffc4b43f2e90b5d94ac7269b7e83d9b5"} Oct 01 12:39:37 crc kubenswrapper[4727]: I1001 12:39:37.365854 4727 generic.go:334] "Generic (PLEG): container finished" podID="051e51bb-3387-4009-8c88-fd90d76af6e2" containerID="c938a05c6aa8f25db25adcae2cada7c498426b74e542e0a11b95a7af7434f8e5" exitCode=0 Oct 01 12:39:37 crc kubenswrapper[4727]: I1001 12:39:37.365911 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2mhjs" event={"ID":"051e51bb-3387-4009-8c88-fd90d76af6e2","Type":"ContainerDied","Data":"c938a05c6aa8f25db25adcae2cada7c498426b74e542e0a11b95a7af7434f8e5"} Oct 01 12:39:37 crc kubenswrapper[4727]: I1001 12:39:37.365945 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2mhjs" event={"ID":"051e51bb-3387-4009-8c88-fd90d76af6e2","Type":"ContainerStarted","Data":"7e52048bf40d26be5c996f699f1b883526d7740622d4513c670c5dd0532b0bb0"} Oct 01 12:39:37 crc kubenswrapper[4727]: I1001 12:39:37.383575 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.383551319 podStartE2EDuration="2.383551319s" podCreationTimestamp="2025-10-01 12:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:39:37.309326774 +0000 UTC m=+155.630681621" watchObservedRunningTime="2025-10-01 12:39:37.383551319 +0000 UTC m=+155.704906176" Oct 01 12:39:37 crc kubenswrapper[4727]: I1001 12:39:37.491146 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bkz5m"] Oct 01 12:39:37 crc kubenswrapper[4727]: I1001 12:39:37.699197 4727 patch_prober.go:28] interesting pod/router-default-5444994796-8x58q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Oct 01 12:39:37 crc kubenswrapper[4727]: [-]has-synced failed: reason withheld Oct 01 12:39:37 crc kubenswrapper[4727]: [+]process-running ok Oct 01 12:39:37 crc kubenswrapper[4727]: healthz check failed Oct 01 12:39:37 crc kubenswrapper[4727]: I1001 12:39:37.699250 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8x58q" podUID="228aaa44-0de4-45e8-87d9-78ad4fa70f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.402128 4727 generic.go:334] "Generic (PLEG): container finished" podID="3ce32e99-94ad-40ce-988d-febe69875dac" containerID="a3f7fddbadccc62936b5b2bf6e58ad52e36a7b4ae312945e5db9f2a713989060" exitCode=0 Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.405360 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3ce32e99-94ad-40ce-988d-febe69875dac","Type":"ContainerDied","Data":"a3f7fddbadccc62936b5b2bf6e58ad52e36a7b4ae312945e5db9f2a713989060"} Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.410151 4727 generic.go:334] "Generic (PLEG): container finished" podID="d1b0fc07-8033-4220-a491-cc668e795d10" containerID="3ab9b4d3f81f00309d02784c98d65551bcd1344e6137a042bb8f0cd1159a6054" exitCode=0 Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.410227 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8w8" event={"ID":"d1b0fc07-8033-4220-a491-cc668e795d10","Type":"ContainerDied","Data":"3ab9b4d3f81f00309d02784c98d65551bcd1344e6137a042bb8f0cd1159a6054"} Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.435829 4727 generic.go:334] "Generic (PLEG): container finished" podID="b9005845-d890-439c-9b2d-2a80c0f61697" containerID="ceeb1d64f10312bdcee0acbf26f61eedd07ed3c72513d8a563846b73fc27e6cf" exitCode=0 Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.435922 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkz5m" event={"ID":"b9005845-d890-439c-9b2d-2a80c0f61697","Type":"ContainerDied","Data":"ceeb1d64f10312bdcee0acbf26f61eedd07ed3c72513d8a563846b73fc27e6cf"} Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.436047 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkz5m" event={"ID":"b9005845-d890-439c-9b2d-2a80c0f61697","Type":"ContainerStarted","Data":"fa7f73a7c80225c2518284708a97142af9bebd86c46fc9d646d4cb996f98a8d9"} Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.710919 4727 patch_prober.go:28] interesting pod/router-default-5444994796-8x58q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:39:38 crc kubenswrapper[4727]: [-]has-synced failed: reason withheld Oct 01 12:39:38 crc kubenswrapper[4727]: [+]process-running ok Oct 01 12:39:38 crc kubenswrapper[4727]: healthz check failed Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.711026 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8x58q" podUID="228aaa44-0de4-45e8-87d9-78ad4fa70f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.803954 4727 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.928771 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1974282f-c2f4-48cd-97e2-9e880203ef1c-secret-volume\") pod \"1974282f-c2f4-48cd-97e2-9e880203ef1c\" (UID: \"1974282f-c2f4-48cd-97e2-9e880203ef1c\") " Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.928847 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1974282f-c2f4-48cd-97e2-9e880203ef1c-config-volume\") pod \"1974282f-c2f4-48cd-97e2-9e880203ef1c\" (UID: \"1974282f-c2f4-48cd-97e2-9e880203ef1c\") " Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.928932 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snndw\" (UniqueName: \"kubernetes.io/projected/1974282f-c2f4-48cd-97e2-9e880203ef1c-kube-api-access-snndw\") pod \"1974282f-c2f4-48cd-97e2-9e880203ef1c\" (UID: \"1974282f-c2f4-48cd-97e2-9e880203ef1c\") " Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.930422 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1974282f-c2f4-48cd-97e2-9e880203ef1c-config-volume" (OuterVolumeSpecName: "config-volume") pod "1974282f-c2f4-48cd-97e2-9e880203ef1c" (UID: "1974282f-c2f4-48cd-97e2-9e880203ef1c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.941009 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1974282f-c2f4-48cd-97e2-9e880203ef1c-kube-api-access-snndw" (OuterVolumeSpecName: "kube-api-access-snndw") pod "1974282f-c2f4-48cd-97e2-9e880203ef1c" (UID: "1974282f-c2f4-48cd-97e2-9e880203ef1c"). InnerVolumeSpecName "kube-api-access-snndw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:39:38 crc kubenswrapper[4727]: I1001 12:39:38.952383 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1974282f-c2f4-48cd-97e2-9e880203ef1c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1974282f-c2f4-48cd-97e2-9e880203ef1c" (UID: "1974282f-c2f4-48cd-97e2-9e880203ef1c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.030184 4727 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1974282f-c2f4-48cd-97e2-9e880203ef1c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.030259 4727 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1974282f-c2f4-48cd-97e2-9e880203ef1c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.030273 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snndw\" (UniqueName: \"kubernetes.io/projected/1974282f-c2f4-48cd-97e2-9e880203ef1c-kube-api-access-snndw\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.460992 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.464283 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n" event={"ID":"1974282f-c2f4-48cd-97e2-9e880203ef1c","Type":"ContainerDied","Data":"e7db7b39438668f9f764fc907971dca3c7caf432ac4cc0cd3aa310238c679c34"} Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.464367 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7db7b39438668f9f764fc907971dca3c7caf432ac4cc0cd3aa310238c679c34" Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.701559 4727 patch_prober.go:28] interesting pod/router-default-5444994796-8x58q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 12:39:39 crc kubenswrapper[4727]: [+]has-synced ok Oct 01 12:39:39 crc kubenswrapper[4727]: [+]process-running ok Oct 01 12:39:39 crc kubenswrapper[4727]: healthz check failed Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.701910 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8x58q" podUID="228aaa44-0de4-45e8-87d9-78ad4fa70f2e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.829867 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.951054 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ce32e99-94ad-40ce-988d-febe69875dac-kubelet-dir\") pod \"3ce32e99-94ad-40ce-988d-febe69875dac\" (UID: \"3ce32e99-94ad-40ce-988d-febe69875dac\") " Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.951153 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ce32e99-94ad-40ce-988d-febe69875dac-kube-api-access\") pod \"3ce32e99-94ad-40ce-988d-febe69875dac\" (UID: \"3ce32e99-94ad-40ce-988d-febe69875dac\") " Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.951246 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ce32e99-94ad-40ce-988d-febe69875dac-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3ce32e99-94ad-40ce-988d-febe69875dac" (UID: "3ce32e99-94ad-40ce-988d-febe69875dac"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.951489 4727 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ce32e99-94ad-40ce-988d-febe69875dac-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:39 crc kubenswrapper[4727]: I1001 12:39:39.981892 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce32e99-94ad-40ce-988d-febe69875dac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3ce32e99-94ad-40ce-988d-febe69875dac" (UID: "3ce32e99-94ad-40ce-988d-febe69875dac"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:39:40 crc kubenswrapper[4727]: I1001 12:39:40.052786 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ce32e99-94ad-40ce-988d-febe69875dac-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 12:39:40 crc kubenswrapper[4727]: I1001 12:39:40.489704 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3ce32e99-94ad-40ce-988d-febe69875dac","Type":"ContainerDied","Data":"dcb2ff196639962863a7f38372324cd9aeaf4fa17ec81c0893585e104c7e28ed"} Oct 01 12:39:40 crc kubenswrapper[4727]: I1001 12:39:40.490189 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcb2ff196639962863a7f38372324cd9aeaf4fa17ec81c0893585e104c7e28ed" Oct 01 12:39:40 crc kubenswrapper[4727]: I1001 12:39:40.490041 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 12:39:40 crc kubenswrapper[4727]: I1001 12:39:40.703929 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:40 crc kubenswrapper[4727]: I1001 12:39:40.708031 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8x58q" Oct 01 12:39:40 crc kubenswrapper[4727]: I1001 12:39:40.715484 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cfmpf" Oct 01 12:39:44 crc kubenswrapper[4727]: I1001 12:39:44.904030 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:44 crc kubenswrapper[4727]: I1001 12:39:44.910286 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:39:45 crc kubenswrapper[4727]: I1001 12:39:45.508056 4727 patch_prober.go:28] interesting pod/downloads-7954f5f757-qktm7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 01 12:39:45 crc kubenswrapper[4727]: I1001 12:39:45.508085 4727 patch_prober.go:28] interesting pod/downloads-7954f5f757-qktm7 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Oct 01 12:39:45 crc kubenswrapper[4727]: I1001 12:39:45.508130 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qktm7" podUID="61ea519c-4d97-4e3e-b932-51a3f8e2e07f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 01 12:39:45 crc kubenswrapper[4727]: I1001 12:39:45.508130 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qktm7" podUID="61ea519c-4d97-4e3e-b932-51a3f8e2e07f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Oct 01 12:39:47 crc kubenswrapper[4727]: I1001 12:39:47.793387 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:47 crc kubenswrapper[4727]: I1001 12:39:47.803768 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7f4ab8d-5f57-47bd-93fc-9219c596c436-metrics-certs\") pod \"network-metrics-daemon-tvtzh\" (UID: \"f7f4ab8d-5f57-47bd-93fc-9219c596c436\") " pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:48 crc kubenswrapper[4727]: I1001 12:39:48.098719 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvtzh" Oct 01 12:39:53 crc kubenswrapper[4727]: I1001 12:39:53.926431 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:39:55 crc kubenswrapper[4727]: I1001 12:39:55.517992 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qktm7" Oct 01 12:40:03 crc kubenswrapper[4727]: I1001 12:40:03.293685 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:40:03 crc kubenswrapper[4727]: I1001 12:40:03.294264 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:40:05 crc kubenswrapper[4727]: E1001 12:40:05.505630 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 01 12:40:05 crc kubenswrapper[4727]: E1001 12:40:05.507692 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jqxw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2mhjs_openshift-marketplace(051e51bb-3387-4009-8c88-fd90d76af6e2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 12:40:05 crc kubenswrapper[4727]: E1001 12:40:05.509128 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2mhjs" podUID="051e51bb-3387-4009-8c88-fd90d76af6e2" Oct 01 12:40:05 crc kubenswrapper[4727]: E1001 12:40:05.615083 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 01 12:40:05 crc kubenswrapper[4727]: E1001 12:40:05.615414 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xnnmb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kws6b_openshift-marketplace(defd3d6f-dd53-4725-af25-c711790c4870): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 12:40:05 crc kubenswrapper[4727]: E1001 12:40:05.616752 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kws6b" podUID="defd3d6f-dd53-4725-af25-c711790c4870" Oct 01 12:40:05 crc kubenswrapper[4727]: I1001 12:40:05.660586 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g6h55" Oct 01 12:40:08 crc kubenswrapper[4727]: E1001 12:40:08.410390 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2mhjs" podUID="051e51bb-3387-4009-8c88-fd90d76af6e2" Oct 01 12:40:08 crc kubenswrapper[4727]: E1001 12:40:08.410598 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kws6b" podUID="defd3d6f-dd53-4725-af25-c711790c4870" Oct 01 12:40:08 crc kubenswrapper[4727]: E1001 12:40:08.501692 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 12:40:08 crc kubenswrapper[4727]: E1001 12:40:08.502282 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rvr5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-nc8w8_openshift-marketplace(d1b0fc07-8033-4220-a491-cc668e795d10): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 12:40:08 crc kubenswrapper[4727]: E1001 12:40:08.503525 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-nc8w8" podUID="d1b0fc07-8033-4220-a491-cc668e795d10" Oct 01 12:40:09 crc kubenswrapper[4727]: E1001 12:40:09.879877 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-nc8w8" podUID="d1b0fc07-8033-4220-a491-cc668e795d10" Oct 01 12:40:09 crc kubenswrapper[4727]: E1001 12:40:09.961443 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 12:40:09 crc kubenswrapper[4727]: E1001 12:40:09.961737 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qr9tw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kfxlj_openshift-marketplace(e4d11880-82d5-49b0-965a-e2fc54b9c775): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 12:40:09 crc kubenswrapper[4727]: E1001 12:40:09.962980 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kfxlj" podUID="e4d11880-82d5-49b0-965a-e2fc54b9c775" Oct 01 12:40:11 crc kubenswrapper[4727]: I1001 12:40:11.416903 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.477267 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kfxlj" podUID="e4d11880-82d5-49b0-965a-e2fc54b9c775" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.566050 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.566307 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g4l2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8l85n_openshift-marketplace(8548d350-ee32-44e6-85d2-2e30036d5eb8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.568175 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8l85n" podUID="8548d350-ee32-44e6-85d2-2e30036d5eb8" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.597482 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.597618 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r4h42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hj5p4_openshift-marketplace(c9a39814-1723-46e2-b468-67e6cf668788): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.598879 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hj5p4" podUID="c9a39814-1723-46e2-b468-67e6cf668788" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.619199 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.619542 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fkk2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lrd89_openshift-marketplace(c3beff22-e67e-4639-9562-3663809167d7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.620845 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lrd89" podUID="c3beff22-e67e-4639-9562-3663809167d7" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.733495 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8l85n" podUID="8548d350-ee32-44e6-85d2-2e30036d5eb8" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.734120 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lrd89" podUID="c3beff22-e67e-4639-9562-3663809167d7" Oct 01 12:40:11 crc kubenswrapper[4727]: E1001 12:40:11.734616 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hj5p4" podUID="c9a39814-1723-46e2-b468-67e6cf668788" Oct 01 12:40:11 crc kubenswrapper[4727]: I1001 12:40:11.953436 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tvtzh"] Oct 01 12:40:11 crc kubenswrapper[4727]: W1001 12:40:11.966513 4727 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7f4ab8d_5f57_47bd_93fc_9219c596c436.slice/crio-9f7193a8cc7f99474ecd86b0f031f5b38384e7fc2669ba7a708568127b92464b WatchSource:0}: Error finding container 9f7193a8cc7f99474ecd86b0f031f5b38384e7fc2669ba7a708568127b92464b: Status 404 returned error can't find the container with id 9f7193a8cc7f99474ecd86b0f031f5b38384e7fc2669ba7a708568127b92464b Oct 01 12:40:12 crc kubenswrapper[4727]: I1001 12:40:12.743152 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" event={"ID":"f7f4ab8d-5f57-47bd-93fc-9219c596c436","Type":"ContainerStarted","Data":"01a691c5a3b681ce0fb0d307a78c36e4743a9d0d1d4b5ba8beabbe9ee468a35a"} Oct 01 12:40:12 crc kubenswrapper[4727]: I1001 12:40:12.745298 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" event={"ID":"f7f4ab8d-5f57-47bd-93fc-9219c596c436","Type":"ContainerStarted","Data":"a23cc2d099b52b15202ce14880fb8cf60d2bc711a5b5ce480b6a1273c1ed4f39"} Oct 01 12:40:12 crc kubenswrapper[4727]: I1001 12:40:12.745454 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tvtzh" event={"ID":"f7f4ab8d-5f57-47bd-93fc-9219c596c436","Type":"ContainerStarted","Data":"9f7193a8cc7f99474ecd86b0f031f5b38384e7fc2669ba7a708568127b92464b"} Oct 01 12:40:12 crc kubenswrapper[4727]: I1001 12:40:12.746938 4727 generic.go:334] "Generic (PLEG): container finished" podID="b9005845-d890-439c-9b2d-2a80c0f61697" containerID="9323a2140b877018cce5d02fab48a09b394e375f07e50934a0031551a69de715" exitCode=0 Oct 01 12:40:12 crc kubenswrapper[4727]: I1001 12:40:12.747089 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkz5m" event={"ID":"b9005845-d890-439c-9b2d-2a80c0f61697","Type":"ContainerDied","Data":"9323a2140b877018cce5d02fab48a09b394e375f07e50934a0031551a69de715"} Oct 01 12:40:12 crc kubenswrapper[4727]: I1001 12:40:12.784679 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tvtzh" podStartSLOduration=167.784646154 podStartE2EDuration="2m47.784646154s" podCreationTimestamp="2025-10-01 12:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:40:12.772070772 +0000 UTC m=+191.093425619" watchObservedRunningTime="2025-10-01 12:40:12.784646154 +0000 UTC m=+191.106001021" Oct 01 12:40:13 crc kubenswrapper[4727]: I1001 12:40:13.769557 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkz5m" event={"ID":"b9005845-d890-439c-9b2d-2a80c0f61697","Type":"ContainerStarted","Data":"61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b"} Oct 01 12:40:16 crc kubenswrapper[4727]: I1001 12:40:16.793066 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:40:16 crc kubenswrapper[4727]: I1001 12:40:16.793616 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:40:17 crc kubenswrapper[4727]: I1001 12:40:17.986614 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bkz5m" podUID="b9005845-d890-439c-9b2d-2a80c0f61697" containerName="registry-server" probeResult="failure" output=< Oct 01 12:40:17 crc kubenswrapper[4727]: timeout: 
failed to connect service ":50051" within 1s Oct 01 12:40:17 crc kubenswrapper[4727]: > Oct 01 12:40:21 crc kubenswrapper[4727]: I1001 12:40:21.417238 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bkz5m" podStartSLOduration=11.648700025 podStartE2EDuration="45.417197429s" podCreationTimestamp="2025-10-01 12:39:36 +0000 UTC" firstStartedPulling="2025-10-01 12:39:39.472810405 +0000 UTC m=+157.794165242" lastFinishedPulling="2025-10-01 12:40:13.241307769 +0000 UTC m=+191.562662646" observedRunningTime="2025-10-01 12:40:13.801779214 +0000 UTC m=+192.123134061" watchObservedRunningTime="2025-10-01 12:40:21.417197429 +0000 UTC m=+199.738552346" Oct 01 12:40:23 crc kubenswrapper[4727]: I1001 12:40:23.839458 4727 generic.go:334] "Generic (PLEG): container finished" podID="051e51bb-3387-4009-8c88-fd90d76af6e2" containerID="3752d1dbf7c67719908ee77ea76add2765afb018530bf3dca3e0c22a7649a460" exitCode=0 Oct 01 12:40:23 crc kubenswrapper[4727]: I1001 12:40:23.839537 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2mhjs" event={"ID":"051e51bb-3387-4009-8c88-fd90d76af6e2","Type":"ContainerDied","Data":"3752d1dbf7c67719908ee77ea76add2765afb018530bf3dca3e0c22a7649a460"} Oct 01 12:40:23 crc kubenswrapper[4727]: I1001 12:40:23.844289 4727 generic.go:334] "Generic (PLEG): container finished" podID="defd3d6f-dd53-4725-af25-c711790c4870" containerID="7ba087df4c3d5b49eea3df564075bfe4a20840ba5e5f322cc9646e9485e1e7d4" exitCode=0 Oct 01 12:40:23 crc kubenswrapper[4727]: I1001 12:40:23.844316 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kws6b" event={"ID":"defd3d6f-dd53-4725-af25-c711790c4870","Type":"ContainerDied","Data":"7ba087df4c3d5b49eea3df564075bfe4a20840ba5e5f322cc9646e9485e1e7d4"} Oct 01 12:40:24 crc kubenswrapper[4727]: I1001 12:40:24.855132 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8w8" event={"ID":"d1b0fc07-8033-4220-a491-cc668e795d10","Type":"ContainerStarted","Data":"dca42bdb1470accb73b6ee487d638512b588f9a2fdc4c0365068a922a79016e4"} Oct 01 12:40:25 crc kubenswrapper[4727]: I1001 12:40:25.866340 4727 generic.go:334] "Generic (PLEG): container finished" podID="e4d11880-82d5-49b0-965a-e2fc54b9c775" containerID="87c5fa19491de0495d5f85a85309747ffa9631cef4240ce44a3a38422540685a" exitCode=0 Oct 01 12:40:25 crc kubenswrapper[4727]: I1001 12:40:25.866436 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxlj" event={"ID":"e4d11880-82d5-49b0-965a-e2fc54b9c775","Type":"ContainerDied","Data":"87c5fa19491de0495d5f85a85309747ffa9631cef4240ce44a3a38422540685a"} Oct 01 12:40:25 crc kubenswrapper[4727]: I1001 12:40:25.877221 4727 generic.go:334] "Generic (PLEG): container finished" podID="d1b0fc07-8033-4220-a491-cc668e795d10" containerID="dca42bdb1470accb73b6ee487d638512b588f9a2fdc4c0365068a922a79016e4" exitCode=0 Oct 01 12:40:25 crc kubenswrapper[4727]: I1001 12:40:25.877323 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8w8" event={"ID":"d1b0fc07-8033-4220-a491-cc668e795d10","Type":"ContainerDied","Data":"dca42bdb1470accb73b6ee487d638512b588f9a2fdc4c0365068a922a79016e4"} Oct 01 12:40:26 crc kubenswrapper[4727]: I1001 12:40:26.858301 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:40:26 crc 
kubenswrapper[4727]: I1001 12:40:26.886944 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kws6b" event={"ID":"defd3d6f-dd53-4725-af25-c711790c4870","Type":"ContainerStarted","Data":"18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782"} Oct 01 12:40:26 crc kubenswrapper[4727]: I1001 12:40:26.888793 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2mhjs" event={"ID":"051e51bb-3387-4009-8c88-fd90d76af6e2","Type":"ContainerStarted","Data":"2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b"} Oct 01 12:40:26 crc kubenswrapper[4727]: I1001 12:40:26.895177 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxlj" event={"ID":"e4d11880-82d5-49b0-965a-e2fc54b9c775","Type":"ContainerStarted","Data":"7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8"} Oct 01 12:40:26 crc kubenswrapper[4727]: I1001 12:40:26.914262 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kws6b" podStartSLOduration=3.6271971670000003 podStartE2EDuration="52.914242779s" podCreationTimestamp="2025-10-01 12:39:34 +0000 UTC" firstStartedPulling="2025-10-01 12:39:36.353713571 +0000 UTC m=+154.675068408" lastFinishedPulling="2025-10-01 12:40:25.640759183 +0000 UTC m=+203.962114020" observedRunningTime="2025-10-01 12:40:26.911866203 +0000 UTC m=+205.233221040" watchObservedRunningTime="2025-10-01 12:40:26.914242779 +0000 UTC m=+205.235597616" Oct 01 12:40:26 crc kubenswrapper[4727]: I1001 12:40:26.921688 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:40:26 crc kubenswrapper[4727]: I1001 12:40:26.930055 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kfxlj" podStartSLOduration=2.85790123 podStartE2EDuration="53.930029544s" podCreationTimestamp="2025-10-01 12:39:33 +0000 UTC" firstStartedPulling="2025-10-01 12:39:35.185081938 +0000 UTC m=+153.506436775" lastFinishedPulling="2025-10-01 12:40:26.257210252 +0000 UTC m=+204.578565089" observedRunningTime="2025-10-01 12:40:26.929090065 +0000 UTC m=+205.250444902" watchObservedRunningTime="2025-10-01 12:40:26.930029544 +0000 UTC m=+205.251384381" Oct 01 12:40:26 crc kubenswrapper[4727]: I1001 12:40:26.947604 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2mhjs" podStartSLOduration=3.636117404 podStartE2EDuration="51.947579476s" podCreationTimestamp="2025-10-01 12:39:35 +0000 UTC" firstStartedPulling="2025-10-01 12:39:37.376875343 +0000 UTC m=+155.698230170" lastFinishedPulling="2025-10-01 12:40:25.688337405 +0000 UTC m=+204.009692242" observedRunningTime="2025-10-01 12:40:26.943446964 +0000 UTC m=+205.264801811" watchObservedRunningTime="2025-10-01 12:40:26.947579476 +0000 UTC m=+205.268934333" Oct 01 12:40:28 crc kubenswrapper[4727]: I1001 12:40:28.911317 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8w8" event={"ID":"d1b0fc07-8033-4220-a491-cc668e795d10","Type":"ContainerStarted","Data":"67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b"} Oct 01 12:40:29 crc kubenswrapper[4727]: I1001 12:40:29.943965 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nc8w8" 
podStartSLOduration=4.13486667 podStartE2EDuration="54.943943675s" podCreationTimestamp="2025-10-01 12:39:35 +0000 UTC" firstStartedPulling="2025-10-01 12:39:37.343480246 +0000 UTC m=+155.664835073" lastFinishedPulling="2025-10-01 12:40:28.152557241 +0000 UTC m=+206.473912078" observedRunningTime="2025-10-01 12:40:29.94192142 +0000 UTC m=+208.263276277" watchObservedRunningTime="2025-10-01 12:40:29.943943675 +0000 UTC m=+208.265298532" Oct 01 12:40:30 crc kubenswrapper[4727]: I1001 12:40:30.230686 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bkz5m"] Oct 01 12:40:30 crc kubenswrapper[4727]: I1001 12:40:30.231376 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bkz5m" podUID="b9005845-d890-439c-9b2d-2a80c0f61697" containerName="registry-server" containerID="cri-o://61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b" gracePeriod=2 Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.676591 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.696058 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9005845-d890-439c-9b2d-2a80c0f61697-catalog-content\") pod \"b9005845-d890-439c-9b2d-2a80c0f61697\" (UID: \"b9005845-d890-439c-9b2d-2a80c0f61697\") " Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.696157 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9005845-d890-439c-9b2d-2a80c0f61697-utilities\") pod \"b9005845-d890-439c-9b2d-2a80c0f61697\" (UID: \"b9005845-d890-439c-9b2d-2a80c0f61697\") " Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.696203 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz2wx\" (UniqueName: \"kubernetes.io/projected/b9005845-d890-439c-9b2d-2a80c0f61697-kube-api-access-jz2wx\") pod \"b9005845-d890-439c-9b2d-2a80c0f61697\" (UID: \"b9005845-d890-439c-9b2d-2a80c0f61697\") " Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.698562 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9005845-d890-439c-9b2d-2a80c0f61697-utilities" (OuterVolumeSpecName: "utilities") pod "b9005845-d890-439c-9b2d-2a80c0f61697" (UID: "b9005845-d890-439c-9b2d-2a80c0f61697"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.704985 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9005845-d890-439c-9b2d-2a80c0f61697-kube-api-access-jz2wx" (OuterVolumeSpecName: "kube-api-access-jz2wx") pod "b9005845-d890-439c-9b2d-2a80c0f61697" (UID: "b9005845-d890-439c-9b2d-2a80c0f61697"). InnerVolumeSpecName "kube-api-access-jz2wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.797243 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9005845-d890-439c-9b2d-2a80c0f61697-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9005845-d890-439c-9b2d-2a80c0f61697" (UID: "b9005845-d890-439c-9b2d-2a80c0f61697"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.798033 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9005845-d890-439c-9b2d-2a80c0f61697-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.798057 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9005845-d890-439c-9b2d-2a80c0f61697-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.798071 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz2wx\" (UniqueName: \"kubernetes.io/projected/b9005845-d890-439c-9b2d-2a80c0f61697-kube-api-access-jz2wx\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.941286 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrd89" event={"ID":"c3beff22-e67e-4639-9562-3663809167d7","Type":"ContainerStarted","Data":"328139b556e248f7536cae4f1d6cbf86c442cba2895246222d5b15eb7483c373"} Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.944520 4727 generic.go:334] "Generic (PLEG): container finished" podID="b9005845-d890-439c-9b2d-2a80c0f61697" containerID="61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b" exitCode=0 Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.944593 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkz5m" event={"ID":"b9005845-d890-439c-9b2d-2a80c0f61697","Type":"ContainerDied","Data":"61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b"} Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.944602 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bkz5m" Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.944637 4727 scope.go:117] "RemoveContainer" containerID="61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b" Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.944623 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkz5m" event={"ID":"b9005845-d890-439c-9b2d-2a80c0f61697","Type":"ContainerDied","Data":"fa7f73a7c80225c2518284708a97142af9bebd86c46fc9d646d4cb996f98a8d9"} Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.947964 4727 generic.go:334] "Generic (PLEG): container finished" podID="c9a39814-1723-46e2-b468-67e6cf668788" containerID="d81cdce9e5c45d540de27903de21675e1fecb5452e4c96b593a707a5974344ed" exitCode=0 Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.948121 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj5p4" event={"ID":"c9a39814-1723-46e2-b468-67e6cf668788","Type":"ContainerDied","Data":"d81cdce9e5c45d540de27903de21675e1fecb5452e4c96b593a707a5974344ed"} Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.951829 4727 generic.go:334] "Generic (PLEG): container finished" podID="8548d350-ee32-44e6-85d2-2e30036d5eb8" containerID="92f21923a9b1375915ebed149f16ea09a8db36ce0e0c84ce69b251086f97b2e3" exitCode=0 Oct 01 12:40:31 crc kubenswrapper[4727]: I1001 12:40:31.951862 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l85n" event={"ID":"8548d350-ee32-44e6-85d2-2e30036d5eb8","Type":"ContainerDied","Data":"92f21923a9b1375915ebed149f16ea09a8db36ce0e0c84ce69b251086f97b2e3"} Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.128326 4727 scope.go:117] "RemoveContainer" containerID="9323a2140b877018cce5d02fab48a09b394e375f07e50934a0031551a69de715" Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.140891 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bkz5m"] Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.147083 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bkz5m"] Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.148926 4727 scope.go:117] "RemoveContainer" containerID="ceeb1d64f10312bdcee0acbf26f61eedd07ed3c72513d8a563846b73fc27e6cf" Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.165923 4727 scope.go:117] "RemoveContainer" containerID="61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b" Oct 01 12:40:32 crc kubenswrapper[4727]: E1001 12:40:32.166714 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b\": container with ID starting with 61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b not found: ID does not exist" containerID="61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b" Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.166750 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b"} err="failed to get container status \"61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b\": rpc error: code = NotFound desc = could not find container \"61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b\": container with ID 
starting with 61f2923717f79a25ff6d1aff2b54a28bdf5ca454e85032b4d234b2195417a37b not found: ID does not exist" Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.166808 4727 scope.go:117] "RemoveContainer" containerID="9323a2140b877018cce5d02fab48a09b394e375f07e50934a0031551a69de715" Oct 01 12:40:32 crc kubenswrapper[4727]: E1001 12:40:32.167097 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9323a2140b877018cce5d02fab48a09b394e375f07e50934a0031551a69de715\": container with ID starting with 9323a2140b877018cce5d02fab48a09b394e375f07e50934a0031551a69de715 not found: ID does not exist" containerID="9323a2140b877018cce5d02fab48a09b394e375f07e50934a0031551a69de715" Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.167144 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9323a2140b877018cce5d02fab48a09b394e375f07e50934a0031551a69de715"} err="failed to get container status \"9323a2140b877018cce5d02fab48a09b394e375f07e50934a0031551a69de715\": rpc error: code = NotFound desc = could not find container \"9323a2140b877018cce5d02fab48a09b394e375f07e50934a0031551a69de715\": container with ID starting with 9323a2140b877018cce5d02fab48a09b394e375f07e50934a0031551a69de715 not found: ID does not exist" Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.167174 4727 scope.go:117] "RemoveContainer" containerID="ceeb1d64f10312bdcee0acbf26f61eedd07ed3c72513d8a563846b73fc27e6cf" Oct 01 12:40:32 crc kubenswrapper[4727]: E1001 12:40:32.167480 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceeb1d64f10312bdcee0acbf26f61eedd07ed3c72513d8a563846b73fc27e6cf\": container with ID starting with ceeb1d64f10312bdcee0acbf26f61eedd07ed3c72513d8a563846b73fc27e6cf not found: ID does not exist" containerID="ceeb1d64f10312bdcee0acbf26f61eedd07ed3c72513d8a563846b73fc27e6cf" Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.167511 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceeb1d64f10312bdcee0acbf26f61eedd07ed3c72513d8a563846b73fc27e6cf"} err="failed to get container status \"ceeb1d64f10312bdcee0acbf26f61eedd07ed3c72513d8a563846b73fc27e6cf\": rpc error: code = NotFound desc = could not find container \"ceeb1d64f10312bdcee0acbf26f61eedd07ed3c72513d8a563846b73fc27e6cf\": container with ID starting with ceeb1d64f10312bdcee0acbf26f61eedd07ed3c72513d8a563846b73fc27e6cf not found: ID does not exist" Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.379818 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9005845-d890-439c-9b2d-2a80c0f61697" path="/var/lib/kubelet/pods/b9005845-d890-439c-9b2d-2a80c0f61697/volumes" Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.958791 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l85n" event={"ID":"8548d350-ee32-44e6-85d2-2e30036d5eb8","Type":"ContainerStarted","Data":"3a20d53158f6fc563f553367162de3e24f0261c83d1f7cb33453c517997897ef"} Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.962432 4727 generic.go:334] "Generic (PLEG): container finished" podID="c3beff22-e67e-4639-9562-3663809167d7" containerID="328139b556e248f7536cae4f1d6cbf86c442cba2895246222d5b15eb7483c373" exitCode=0 Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.962499 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-lrd89" event={"ID":"c3beff22-e67e-4639-9562-3663809167d7","Type":"ContainerDied","Data":"328139b556e248f7536cae4f1d6cbf86c442cba2895246222d5b15eb7483c373"} Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.962532 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrd89" event={"ID":"c3beff22-e67e-4639-9562-3663809167d7","Type":"ContainerStarted","Data":"4423507370b9f15f1fd4ce2f150ed89690f35d0791769d43493d9e2ac22a7f2e"} Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.966394 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj5p4" event={"ID":"c9a39814-1723-46e2-b468-67e6cf668788","Type":"ContainerStarted","Data":"9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31"} Oct 01 12:40:32 crc kubenswrapper[4727]: I1001 12:40:32.978659 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8l85n" podStartSLOduration=2.381737757 podStartE2EDuration="1m0.97864154s" podCreationTimestamp="2025-10-01 12:39:32 +0000 UTC" firstStartedPulling="2025-10-01 12:39:34.097841975 +0000 UTC m=+152.419196812" lastFinishedPulling="2025-10-01 12:40:32.694745758 +0000 UTC m=+211.016100595" observedRunningTime="2025-10-01 12:40:32.976750608 +0000 UTC m=+211.298105445" watchObservedRunningTime="2025-10-01 12:40:32.97864154 +0000 UTC m=+211.299996377" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.002666 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hj5p4" podStartSLOduration=2.545440458 podStartE2EDuration="1m1.002648347s" podCreationTimestamp="2025-10-01 12:39:32 +0000 UTC" firstStartedPulling="2025-10-01 12:39:34.17847881 +0000 UTC m=+152.499833647" lastFinishedPulling="2025-10-01 12:40:32.635686699 +0000 UTC m=+210.957041536" observedRunningTime="2025-10-01 12:40:33.000487948 +0000 UTC m=+211.321842795" watchObservedRunningTime="2025-10-01 12:40:33.002648347 +0000 UTC m=+211.324003184" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.019648 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lrd89" podStartSLOduration=2.432944 podStartE2EDuration="1m0.019630651s" podCreationTimestamp="2025-10-01 12:39:33 +0000 UTC" firstStartedPulling="2025-10-01 12:39:35.181461795 +0000 UTC m=+153.502816662" lastFinishedPulling="2025-10-01 12:40:32.768148476 +0000 UTC m=+211.089503313" observedRunningTime="2025-10-01 12:40:33.016045806 +0000 UTC m=+211.337400653" watchObservedRunningTime="2025-10-01 12:40:33.019630651 +0000 UTC m=+211.340985488" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.129378 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.129433 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.242029 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.242114 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 
12:40:33.292486 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.292561 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.292611 4727 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.293259 4727 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca"} pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.293321 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" containerID="cri-o://d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca" gracePeriod=600 Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.702835 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.703437 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.723146 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.723414 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.766593 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.977133 4727 generic.go:334] "Generic (PLEG): container finished" podID="d18290ae-64a5-44a5-a704-90977d85852b" containerID="d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca" exitCode=0 Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.977222 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerDied","Data":"d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca"} Oct 01 12:40:33 crc kubenswrapper[4727]: I1001 12:40:33.977746 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" 
event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"ff528fc413a67120cbfce88f98833b8fdf8ba19775f84a05229bef0f923e8a19"} Oct 01 12:40:34 crc kubenswrapper[4727]: I1001 12:40:34.043226 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:40:34 crc kubenswrapper[4727]: I1001 12:40:34.171862 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8l85n" podUID="8548d350-ee32-44e6-85d2-2e30036d5eb8" containerName="registry-server" probeResult="failure" output=< Oct 01 12:40:34 crc kubenswrapper[4727]: timeout: failed to connect service ":50051" within 1s Oct 01 12:40:34 crc kubenswrapper[4727]: > Oct 01 12:40:34 crc kubenswrapper[4727]: I1001 12:40:34.277066 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hj5p4" podUID="c9a39814-1723-46e2-b468-67e6cf668788" containerName="registry-server" probeResult="failure" output=< Oct 01 12:40:34 crc kubenswrapper[4727]: timeout: failed to connect service ":50051" within 1s Oct 01 12:40:34 crc kubenswrapper[4727]: > Oct 01 12:40:34 crc kubenswrapper[4727]: I1001 12:40:34.785646 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lrd89" podUID="c3beff22-e67e-4639-9562-3663809167d7" containerName="registry-server" probeResult="failure" output=< Oct 01 12:40:34 crc kubenswrapper[4727]: timeout: failed to connect service ":50051" within 1s Oct 01 12:40:34 crc kubenswrapper[4727]: > Oct 01 12:40:35 crc kubenswrapper[4727]: I1001 12:40:35.211977 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:40:35 crc kubenswrapper[4727]: I1001 12:40:35.212410 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:40:35 crc kubenswrapper[4727]: I1001 12:40:35.227197 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kfxlj"] Oct 01 12:40:35 crc kubenswrapper[4727]: I1001 12:40:35.260437 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:40:35 crc kubenswrapper[4727]: I1001 12:40:35.647434 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:40:35 crc kubenswrapper[4727]: I1001 12:40:35.649184 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:40:35 crc kubenswrapper[4727]: I1001 12:40:35.696953 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:40:35 crc kubenswrapper[4727]: I1001 12:40:35.989663 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kfxlj" podUID="e4d11880-82d5-49b0-965a-e2fc54b9c775" containerName="registry-server" containerID="cri-o://7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8" gracePeriod=2 Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.046666 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.075242 4727 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.288940 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.289027 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.350310 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.350633 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.472076 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr9tw\" (UniqueName: \"kubernetes.io/projected/e4d11880-82d5-49b0-965a-e2fc54b9c775-kube-api-access-qr9tw\") pod \"e4d11880-82d5-49b0-965a-e2fc54b9c775\" (UID: \"e4d11880-82d5-49b0-965a-e2fc54b9c775\") " Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.472134 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4d11880-82d5-49b0-965a-e2fc54b9c775-utilities\") pod \"e4d11880-82d5-49b0-965a-e2fc54b9c775\" (UID: \"e4d11880-82d5-49b0-965a-e2fc54b9c775\") " Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.472230 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4d11880-82d5-49b0-965a-e2fc54b9c775-catalog-content\") pod \"e4d11880-82d5-49b0-965a-e2fc54b9c775\" (UID: \"e4d11880-82d5-49b0-965a-e2fc54b9c775\") " Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.473196 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4d11880-82d5-49b0-965a-e2fc54b9c775-utilities" (OuterVolumeSpecName: "utilities") pod "e4d11880-82d5-49b0-965a-e2fc54b9c775" (UID: "e4d11880-82d5-49b0-965a-e2fc54b9c775"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.473676 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4d11880-82d5-49b0-965a-e2fc54b9c775-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.479103 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4d11880-82d5-49b0-965a-e2fc54b9c775-kube-api-access-qr9tw" (OuterVolumeSpecName: "kube-api-access-qr9tw") pod "e4d11880-82d5-49b0-965a-e2fc54b9c775" (UID: "e4d11880-82d5-49b0-965a-e2fc54b9c775"). InnerVolumeSpecName "kube-api-access-qr9tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.529429 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4d11880-82d5-49b0-965a-e2fc54b9c775-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4d11880-82d5-49b0-965a-e2fc54b9c775" (UID: "e4d11880-82d5-49b0-965a-e2fc54b9c775"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.576698 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr9tw\" (UniqueName: \"kubernetes.io/projected/e4d11880-82d5-49b0-965a-e2fc54b9c775-kube-api-access-qr9tw\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.576759 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4d11880-82d5-49b0-965a-e2fc54b9c775-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.997670 4727 generic.go:334] "Generic (PLEG): container finished" podID="e4d11880-82d5-49b0-965a-e2fc54b9c775" containerID="7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8" exitCode=0 Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.997789 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxlj" event={"ID":"e4d11880-82d5-49b0-965a-e2fc54b9c775","Type":"ContainerDied","Data":"7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8"} Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.997836 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxlj" event={"ID":"e4d11880-82d5-49b0-965a-e2fc54b9c775","Type":"ContainerDied","Data":"1859e903cb4592d1825d9f77e936bfc24f58e0e2ffcc91f5dd0ffbaee3dbb12b"} Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.997856 4727 scope.go:117] "RemoveContainer" containerID="7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8" Oct 01 12:40:36 crc kubenswrapper[4727]: I1001 12:40:36.998079 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kfxlj" Oct 01 12:40:37 crc kubenswrapper[4727]: I1001 12:40:37.018544 4727 scope.go:117] "RemoveContainer" containerID="87c5fa19491de0495d5f85a85309747ffa9631cef4240ce44a3a38422540685a" Oct 01 12:40:37 crc kubenswrapper[4727]: I1001 12:40:37.034186 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kfxlj"] Oct 01 12:40:37 crc kubenswrapper[4727]: I1001 12:40:37.037372 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kfxlj"] Oct 01 12:40:37 crc kubenswrapper[4727]: I1001 12:40:37.052795 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:40:37 crc kubenswrapper[4727]: I1001 12:40:37.057950 4727 scope.go:117] "RemoveContainer" containerID="7589925fab3198d500a3ff17eb74351ef1044950b8c9039669e95794b3039a01" Oct 01 12:40:37 crc kubenswrapper[4727]: I1001 12:40:37.073311 4727 scope.go:117] "RemoveContainer" containerID="7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8" Oct 01 12:40:37 crc kubenswrapper[4727]: E1001 12:40:37.081776 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8\": container with ID starting with 7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8 not found: ID does not exist" containerID="7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8" Oct 01 12:40:37 crc kubenswrapper[4727]: I1001 12:40:37.081837 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8"} err="failed to get container status \"7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8\": rpc error: code = NotFound desc = could not find container \"7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8\": container with ID starting with 7c31c0aa328a02a53a8bd144d36a7ed31d9d40ec2a2c84d342660b03f95ac0c8 not found: ID does not exist" Oct 01 12:40:37 crc kubenswrapper[4727]: I1001 12:40:37.081874 4727 scope.go:117] "RemoveContainer" containerID="87c5fa19491de0495d5f85a85309747ffa9631cef4240ce44a3a38422540685a" Oct 01 12:40:37 crc kubenswrapper[4727]: E1001 12:40:37.082735 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c5fa19491de0495d5f85a85309747ffa9631cef4240ce44a3a38422540685a\": container with ID starting with 87c5fa19491de0495d5f85a85309747ffa9631cef4240ce44a3a38422540685a not found: ID does not exist" containerID="87c5fa19491de0495d5f85a85309747ffa9631cef4240ce44a3a38422540685a" Oct 01 12:40:37 crc kubenswrapper[4727]: I1001 12:40:37.082839 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c5fa19491de0495d5f85a85309747ffa9631cef4240ce44a3a38422540685a"} err="failed to get container status \"87c5fa19491de0495d5f85a85309747ffa9631cef4240ce44a3a38422540685a\": rpc error: code = NotFound desc = could not find container \"87c5fa19491de0495d5f85a85309747ffa9631cef4240ce44a3a38422540685a\": container with ID starting with 87c5fa19491de0495d5f85a85309747ffa9631cef4240ce44a3a38422540685a not found: ID does not exist" Oct 01 12:40:37 crc kubenswrapper[4727]: I1001 12:40:37.082889 4727 scope.go:117] "RemoveContainer" 
containerID="7589925fab3198d500a3ff17eb74351ef1044950b8c9039669e95794b3039a01" Oct 01 12:40:37 crc kubenswrapper[4727]: E1001 12:40:37.084464 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7589925fab3198d500a3ff17eb74351ef1044950b8c9039669e95794b3039a01\": container with ID starting with 7589925fab3198d500a3ff17eb74351ef1044950b8c9039669e95794b3039a01 not found: ID does not exist" containerID="7589925fab3198d500a3ff17eb74351ef1044950b8c9039669e95794b3039a01" Oct 01 12:40:37 crc kubenswrapper[4727]: I1001 12:40:37.084534 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7589925fab3198d500a3ff17eb74351ef1044950b8c9039669e95794b3039a01"} err="failed to get container status \"7589925fab3198d500a3ff17eb74351ef1044950b8c9039669e95794b3039a01\": rpc error: code = NotFound desc = could not find container \"7589925fab3198d500a3ff17eb74351ef1044950b8c9039669e95794b3039a01\": container with ID starting with 7589925fab3198d500a3ff17eb74351ef1044950b8c9039669e95794b3039a01 not found: ID does not exist" Oct 01 12:40:38 crc kubenswrapper[4727]: I1001 12:40:38.379896 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4d11880-82d5-49b0-965a-e2fc54b9c775" path="/var/lib/kubelet/pods/e4d11880-82d5-49b0-965a-e2fc54b9c775/volumes" Oct 01 12:40:38 crc kubenswrapper[4727]: I1001 12:40:38.631960 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2mhjs"] Oct 01 12:40:38 crc kubenswrapper[4727]: I1001 12:40:38.632278 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2mhjs" podUID="051e51bb-3387-4009-8c88-fd90d76af6e2" containerName="registry-server" containerID="cri-o://2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b" gracePeriod=2 Oct 01 12:40:38 crc kubenswrapper[4727]: I1001 12:40:38.969764 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.010771 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/051e51bb-3387-4009-8c88-fd90d76af6e2-catalog-content\") pod \"051e51bb-3387-4009-8c88-fd90d76af6e2\" (UID: \"051e51bb-3387-4009-8c88-fd90d76af6e2\") " Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.010822 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqxw5\" (UniqueName: \"kubernetes.io/projected/051e51bb-3387-4009-8c88-fd90d76af6e2-kube-api-access-jqxw5\") pod \"051e51bb-3387-4009-8c88-fd90d76af6e2\" (UID: \"051e51bb-3387-4009-8c88-fd90d76af6e2\") " Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.010928 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/051e51bb-3387-4009-8c88-fd90d76af6e2-utilities\") pod \"051e51bb-3387-4009-8c88-fd90d76af6e2\" (UID: \"051e51bb-3387-4009-8c88-fd90d76af6e2\") " Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.012085 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/051e51bb-3387-4009-8c88-fd90d76af6e2-utilities" (OuterVolumeSpecName: "utilities") pod "051e51bb-3387-4009-8c88-fd90d76af6e2" (UID: "051e51bb-3387-4009-8c88-fd90d76af6e2"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.029336 4727 generic.go:334] "Generic (PLEG): container finished" podID="051e51bb-3387-4009-8c88-fd90d76af6e2" containerID="2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b" exitCode=0 Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.029383 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2mhjs" event={"ID":"051e51bb-3387-4009-8c88-fd90d76af6e2","Type":"ContainerDied","Data":"2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b"} Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.029394 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/051e51bb-3387-4009-8c88-fd90d76af6e2-kube-api-access-jqxw5" (OuterVolumeSpecName: "kube-api-access-jqxw5") pod "051e51bb-3387-4009-8c88-fd90d76af6e2" (UID: "051e51bb-3387-4009-8c88-fd90d76af6e2"). InnerVolumeSpecName "kube-api-access-jqxw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.029413 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2mhjs" event={"ID":"051e51bb-3387-4009-8c88-fd90d76af6e2","Type":"ContainerDied","Data":"7e52048bf40d26be5c996f699f1b883526d7740622d4513c670c5dd0532b0bb0"} Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.029430 4727 scope.go:117] "RemoveContainer" containerID="2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.029511 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2mhjs" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.030426 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/051e51bb-3387-4009-8c88-fd90d76af6e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "051e51bb-3387-4009-8c88-fd90d76af6e2" (UID: "051e51bb-3387-4009-8c88-fd90d76af6e2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.062222 4727 scope.go:117] "RemoveContainer" containerID="3752d1dbf7c67719908ee77ea76add2765afb018530bf3dca3e0c22a7649a460" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.077798 4727 scope.go:117] "RemoveContainer" containerID="c938a05c6aa8f25db25adcae2cada7c498426b74e542e0a11b95a7af7434f8e5" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.105335 4727 scope.go:117] "RemoveContainer" containerID="2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b" Oct 01 12:40:39 crc kubenswrapper[4727]: E1001 12:40:39.105699 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b\": container with ID starting with 2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b not found: ID does not exist" containerID="2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.105735 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b"} err="failed to get container status \"2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b\": rpc error: code = NotFound desc = could not find container \"2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b\": container with ID starting with 2c5e3c11b7a64660c2f720f2166210cdc66415ae1ba14a6a7888d9345a03863b not found: ID does not exist" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.105761 4727 scope.go:117] "RemoveContainer" containerID="3752d1dbf7c67719908ee77ea76add2765afb018530bf3dca3e0c22a7649a460" Oct 01 12:40:39 crc kubenswrapper[4727]: E1001 12:40:39.106046 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3752d1dbf7c67719908ee77ea76add2765afb018530bf3dca3e0c22a7649a460\": container with ID starting with 3752d1dbf7c67719908ee77ea76add2765afb018530bf3dca3e0c22a7649a460 not found: ID does not exist" containerID="3752d1dbf7c67719908ee77ea76add2765afb018530bf3dca3e0c22a7649a460" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.106076 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3752d1dbf7c67719908ee77ea76add2765afb018530bf3dca3e0c22a7649a460"} err="failed to get container status \"3752d1dbf7c67719908ee77ea76add2765afb018530bf3dca3e0c22a7649a460\": rpc error: code = NotFound desc = could not find container \"3752d1dbf7c67719908ee77ea76add2765afb018530bf3dca3e0c22a7649a460\": container with ID starting with 3752d1dbf7c67719908ee77ea76add2765afb018530bf3dca3e0c22a7649a460 not found: ID does not exist" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.106098 4727 scope.go:117] "RemoveContainer" containerID="c938a05c6aa8f25db25adcae2cada7c498426b74e542e0a11b95a7af7434f8e5" Oct 01 12:40:39 crc kubenswrapper[4727]: E1001 12:40:39.106320 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c938a05c6aa8f25db25adcae2cada7c498426b74e542e0a11b95a7af7434f8e5\": container with ID starting with c938a05c6aa8f25db25adcae2cada7c498426b74e542e0a11b95a7af7434f8e5 not found: ID does not exist" containerID="c938a05c6aa8f25db25adcae2cada7c498426b74e542e0a11b95a7af7434f8e5" Oct 01 12:40:39 crc 
kubenswrapper[4727]: I1001 12:40:39.106340 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c938a05c6aa8f25db25adcae2cada7c498426b74e542e0a11b95a7af7434f8e5"} err="failed to get container status \"c938a05c6aa8f25db25adcae2cada7c498426b74e542e0a11b95a7af7434f8e5\": rpc error: code = NotFound desc = could not find container \"c938a05c6aa8f25db25adcae2cada7c498426b74e542e0a11b95a7af7434f8e5\": container with ID starting with c938a05c6aa8f25db25adcae2cada7c498426b74e542e0a11b95a7af7434f8e5 not found: ID does not exist" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.112098 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/051e51bb-3387-4009-8c88-fd90d76af6e2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.112132 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqxw5\" (UniqueName: \"kubernetes.io/projected/051e51bb-3387-4009-8c88-fd90d76af6e2-kube-api-access-jqxw5\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.112146 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/051e51bb-3387-4009-8c88-fd90d76af6e2-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.354669 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2mhjs"] Oct 01 12:40:39 crc kubenswrapper[4727]: I1001 12:40:39.357131 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2mhjs"] Oct 01 12:40:40 crc kubenswrapper[4727]: I1001 12:40:40.379802 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="051e51bb-3387-4009-8c88-fd90d76af6e2" path="/var/lib/kubelet/pods/051e51bb-3387-4009-8c88-fd90d76af6e2/volumes" Oct 01 12:40:43 crc kubenswrapper[4727]: I1001 12:40:43.197622 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:40:43 crc kubenswrapper[4727]: I1001 12:40:43.256463 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:40:43 crc kubenswrapper[4727]: I1001 12:40:43.294138 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:40:43 crc kubenswrapper[4727]: I1001 12:40:43.343431 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:40:43 crc kubenswrapper[4727]: I1001 12:40:43.794351 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:40:43 crc kubenswrapper[4727]: I1001 12:40:43.850650 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:40:44 crc kubenswrapper[4727]: I1001 12:40:44.093585 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dwszm"] Oct 01 12:40:45 crc kubenswrapper[4727]: I1001 12:40:45.038470 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrd89"] Oct 01 12:40:45 crc kubenswrapper[4727]: I1001 12:40:45.066553 4727 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lrd89" podUID="c3beff22-e67e-4639-9562-3663809167d7" containerName="registry-server" containerID="cri-o://4423507370b9f15f1fd4ce2f150ed89690f35d0791769d43493d9e2ac22a7f2e" gracePeriod=2 Oct 01 12:40:51 crc kubenswrapper[4727]: I1001 12:40:51.101567 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lrd89_c3beff22-e67e-4639-9562-3663809167d7/registry-server/0.log" Oct 01 12:40:51 crc kubenswrapper[4727]: I1001 12:40:51.104008 4727 generic.go:334] "Generic (PLEG): container finished" podID="c3beff22-e67e-4639-9562-3663809167d7" containerID="4423507370b9f15f1fd4ce2f150ed89690f35d0791769d43493d9e2ac22a7f2e" exitCode=137 Oct 01 12:40:51 crc kubenswrapper[4727]: I1001 12:40:51.104087 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrd89" event={"ID":"c3beff22-e67e-4639-9562-3663809167d7","Type":"ContainerDied","Data":"4423507370b9f15f1fd4ce2f150ed89690f35d0791769d43493d9e2ac22a7f2e"} Oct 01 12:40:53 crc kubenswrapper[4727]: I1001 12:40:53.629761 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lrd89_c3beff22-e67e-4639-9562-3663809167d7/registry-server/0.log" Oct 01 12:40:53 crc kubenswrapper[4727]: I1001 12:40:53.631771 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:40:53 crc kubenswrapper[4727]: I1001 12:40:53.721432 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkk2g\" (UniqueName: \"kubernetes.io/projected/c3beff22-e67e-4639-9562-3663809167d7-kube-api-access-fkk2g\") pod \"c3beff22-e67e-4639-9562-3663809167d7\" (UID: \"c3beff22-e67e-4639-9562-3663809167d7\") " Oct 01 12:40:53 crc kubenswrapper[4727]: I1001 12:40:53.721533 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3beff22-e67e-4639-9562-3663809167d7-utilities\") pod \"c3beff22-e67e-4639-9562-3663809167d7\" (UID: \"c3beff22-e67e-4639-9562-3663809167d7\") " Oct 01 12:40:53 crc kubenswrapper[4727]: I1001 12:40:53.721642 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3beff22-e67e-4639-9562-3663809167d7-catalog-content\") pod \"c3beff22-e67e-4639-9562-3663809167d7\" (UID: \"c3beff22-e67e-4639-9562-3663809167d7\") " Oct 01 12:40:53 crc kubenswrapper[4727]: I1001 12:40:53.722977 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3beff22-e67e-4639-9562-3663809167d7-utilities" (OuterVolumeSpecName: "utilities") pod "c3beff22-e67e-4639-9562-3663809167d7" (UID: "c3beff22-e67e-4639-9562-3663809167d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:40:53 crc kubenswrapper[4727]: I1001 12:40:53.730204 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3beff22-e67e-4639-9562-3663809167d7-kube-api-access-fkk2g" (OuterVolumeSpecName: "kube-api-access-fkk2g") pod "c3beff22-e67e-4639-9562-3663809167d7" (UID: "c3beff22-e67e-4639-9562-3663809167d7"). InnerVolumeSpecName "kube-api-access-fkk2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:40:53 crc kubenswrapper[4727]: I1001 12:40:53.822911 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkk2g\" (UniqueName: \"kubernetes.io/projected/c3beff22-e67e-4639-9562-3663809167d7-kube-api-access-fkk2g\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:53 crc kubenswrapper[4727]: I1001 12:40:53.822954 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3beff22-e67e-4639-9562-3663809167d7-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:54 crc kubenswrapper[4727]: I1001 12:40:54.124092 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lrd89_c3beff22-e67e-4639-9562-3663809167d7/registry-server/0.log" Oct 01 12:40:54 crc kubenswrapper[4727]: I1001 12:40:54.124958 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrd89" event={"ID":"c3beff22-e67e-4639-9562-3663809167d7","Type":"ContainerDied","Data":"46cc483d66fd1c6f924b08a4e7f44f6697541002f26871353acc0f96df15c4f4"} Oct 01 12:40:54 crc kubenswrapper[4727]: I1001 12:40:54.125027 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrd89" Oct 01 12:40:54 crc kubenswrapper[4727]: I1001 12:40:54.125029 4727 scope.go:117] "RemoveContainer" containerID="4423507370b9f15f1fd4ce2f150ed89690f35d0791769d43493d9e2ac22a7f2e" Oct 01 12:40:54 crc kubenswrapper[4727]: I1001 12:40:54.141375 4727 scope.go:117] "RemoveContainer" containerID="328139b556e248f7536cae4f1d6cbf86c442cba2895246222d5b15eb7483c373" Oct 01 12:40:54 crc kubenswrapper[4727]: I1001 12:40:54.159328 4727 scope.go:117] "RemoveContainer" containerID="2013368ee9db53dd6a3f4b96626c4f5e57218941c5d10fbf3dc1c4ff55852f98" Oct 01 12:40:55 crc kubenswrapper[4727]: I1001 12:40:55.053517 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3beff22-e67e-4639-9562-3663809167d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3beff22-e67e-4639-9562-3663809167d7" (UID: "c3beff22-e67e-4639-9562-3663809167d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:40:55 crc kubenswrapper[4727]: I1001 12:40:55.141452 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3beff22-e67e-4639-9562-3663809167d7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:40:55 crc kubenswrapper[4727]: I1001 12:40:55.376372 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrd89"] Oct 01 12:40:55 crc kubenswrapper[4727]: I1001 12:40:55.381969 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lrd89"] Oct 01 12:40:56 crc kubenswrapper[4727]: I1001 12:40:56.381219 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3beff22-e67e-4639-9562-3663809167d7" path="/var/lib/kubelet/pods/c3beff22-e67e-4639-9562-3663809167d7/volumes" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.126798 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" podUID="54454532-1909-4aa9-b17e-f244107b202e" containerName="oauth-openshift" containerID="cri-o://de1cf371611a3b346da44f6b17ea101fb8ef51d4fa0defab23a5dc923d80756b" gracePeriod=15 Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.277232 4727 generic.go:334] "Generic (PLEG): container finished" podID="54454532-1909-4aa9-b17e-f244107b202e" containerID="de1cf371611a3b346da44f6b17ea101fb8ef51d4fa0defab23a5dc923d80756b" exitCode=0 Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.277286 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" event={"ID":"54454532-1909-4aa9-b17e-f244107b202e","Type":"ContainerDied","Data":"de1cf371611a3b346da44f6b17ea101fb8ef51d4fa0defab23a5dc923d80756b"} Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.548180 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.598382 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7fb5d9b995-jhc52"] Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599397 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3beff22-e67e-4639-9562-3663809167d7" containerName="registry-server" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599422 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3beff22-e67e-4639-9562-3663809167d7" containerName="registry-server" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599444 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4d11880-82d5-49b0-965a-e2fc54b9c775" containerName="extract-content" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599458 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4d11880-82d5-49b0-965a-e2fc54b9c775" containerName="extract-content" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599473 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3beff22-e67e-4639-9562-3663809167d7" containerName="extract-utilities" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599485 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3beff22-e67e-4639-9562-3663809167d7" containerName="extract-utilities" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599501 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4d11880-82d5-49b0-965a-e2fc54b9c775" containerName="extract-utilities" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599512 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4d11880-82d5-49b0-965a-e2fc54b9c775" containerName="extract-utilities" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599527 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce32e99-94ad-40ce-988d-febe69875dac" containerName="pruner" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599541 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce32e99-94ad-40ce-988d-febe69875dac" containerName="pruner" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599555 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9005845-d890-439c-9b2d-2a80c0f61697" containerName="extract-content" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599567 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9005845-d890-439c-9b2d-2a80c0f61697" containerName="extract-content" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599589 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9005845-d890-439c-9b2d-2a80c0f61697" containerName="extract-utilities" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599600 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9005845-d890-439c-9b2d-2a80c0f61697" containerName="extract-utilities" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599617 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4d11880-82d5-49b0-965a-e2fc54b9c775" containerName="registry-server" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599628 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4d11880-82d5-49b0-965a-e2fc54b9c775" containerName="registry-server" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599642 4727 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="051e51bb-3387-4009-8c88-fd90d76af6e2" containerName="extract-content" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599654 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="051e51bb-3387-4009-8c88-fd90d76af6e2" containerName="extract-content" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599667 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="051e51bb-3387-4009-8c88-fd90d76af6e2" containerName="registry-server" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599681 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="051e51bb-3387-4009-8c88-fd90d76af6e2" containerName="registry-server" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599709 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3beff22-e67e-4639-9562-3663809167d7" containerName="extract-content" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599720 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3beff22-e67e-4639-9562-3663809167d7" containerName="extract-content" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599734 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1974282f-c2f4-48cd-97e2-9e880203ef1c" containerName="collect-profiles" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599744 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="1974282f-c2f4-48cd-97e2-9e880203ef1c" containerName="collect-profiles" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599771 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="051e51bb-3387-4009-8c88-fd90d76af6e2" containerName="extract-utilities" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599782 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="051e51bb-3387-4009-8c88-fd90d76af6e2" containerName="extract-utilities" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599799 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9005845-d890-439c-9b2d-2a80c0f61697" containerName="registry-server" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599811 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9005845-d890-439c-9b2d-2a80c0f61697" containerName="registry-server" Oct 01 12:41:09 crc kubenswrapper[4727]: E1001 12:41:09.599828 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54454532-1909-4aa9-b17e-f244107b202e" containerName="oauth-openshift" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599840 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="54454532-1909-4aa9-b17e-f244107b202e" containerName="oauth-openshift" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.599991 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="1974282f-c2f4-48cd-97e2-9e880203ef1c" containerName="collect-profiles" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.600032 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce32e99-94ad-40ce-988d-febe69875dac" containerName="pruner" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.600045 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3beff22-e67e-4639-9562-3663809167d7" containerName="registry-server" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.600067 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="54454532-1909-4aa9-b17e-f244107b202e" containerName="oauth-openshift" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.600078 4727 
memory_manager.go:354] "RemoveStaleState removing state" podUID="051e51bb-3387-4009-8c88-fd90d76af6e2" containerName="registry-server" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.600092 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4d11880-82d5-49b0-965a-e2fc54b9c775" containerName="registry-server" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.600102 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9005845-d890-439c-9b2d-2a80c0f61697" containerName="registry-server" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.600721 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.608564 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-ocp-branding-template\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.608657 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-audit-policies\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.608696 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s79pt\" (UniqueName: \"kubernetes.io/projected/54454532-1909-4aa9-b17e-f244107b202e-kube-api-access-s79pt\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.608749 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-provider-selection\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.608797 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-trusted-ca-bundle\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.608843 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/54454532-1909-4aa9-b17e-f244107b202e-audit-dir\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.608882 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-session\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.608924 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-login\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.609029 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-cliconfig\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.609062 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-serving-cert\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.609136 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-service-ca\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.609180 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-router-certs\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.609229 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-idp-0-file-data\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.609259 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-error\") pod \"54454532-1909-4aa9-b17e-f244107b202e\" (UID: \"54454532-1909-4aa9-b17e-f244107b202e\") " Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.613390 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.613758 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.616109 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.616169 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.616201 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54454532-1909-4aa9-b17e-f244107b202e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.619347 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.619633 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7fb5d9b995-jhc52"] Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.619970 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.622446 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.637497 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54454532-1909-4aa9-b17e-f244107b202e-kube-api-access-s79pt" (OuterVolumeSpecName: "kube-api-access-s79pt") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "kube-api-access-s79pt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.637566 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.640434 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.640730 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.640901 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.642330 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "54454532-1909-4aa9-b17e-f244107b202e" (UID: "54454532-1909-4aa9-b17e-f244107b202e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.710507 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.710571 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.710601 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.710694 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-user-template-error\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.710729 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.710749 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.710774 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b033f7ae-7f09-4740-808c-575d3731dd96-audit-policies\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.710800 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-session\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711058 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711130 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711198 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711252 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b033f7ae-7f09-4740-808c-575d3731dd96-audit-dir\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711278 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8qcp\" (UniqueName: \"kubernetes.io/projected/b033f7ae-7f09-4740-808c-575d3731dd96-kube-api-access-f8qcp\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711305 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-user-template-login\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711357 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711374 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-cliconfig\") on 
node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711388 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711401 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711412 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711425 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711437 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711453 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711468 4727 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711481 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s79pt\" (UniqueName: \"kubernetes.io/projected/54454532-1909-4aa9-b17e-f244107b202e-kube-api-access-s79pt\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711495 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711508 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711521 4727 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/54454532-1909-4aa9-b17e-f244107b202e-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.711532 4727 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/54454532-1909-4aa9-b17e-f244107b202e-v4-0-config-system-session\") on node 
\"crc\" DevicePath \"\"" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.812452 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.812525 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.812567 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.812614 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-user-template-error\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.812672 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.812833 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.812875 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b033f7ae-7f09-4740-808c-575d3731dd96-audit-policies\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.812915 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-session\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" 
Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.813461 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.813532 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.813609 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.813664 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b033f7ae-7f09-4740-808c-575d3731dd96-audit-dir\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.813709 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8qcp\" (UniqueName: \"kubernetes.io/projected/b033f7ae-7f09-4740-808c-575d3731dd96-kube-api-access-f8qcp\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.813756 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-user-template-login\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.814536 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b033f7ae-7f09-4740-808c-575d3731dd96-audit-policies\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.814627 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b033f7ae-7f09-4740-808c-575d3731dd96-audit-dir\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.815041 4727 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.815084 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.814595 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.816524 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-user-template-error\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.816897 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-session\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.817341 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.817431 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.819197 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.820391 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.820740 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.821351 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b033f7ae-7f09-4740-808c-575d3731dd96-v4-0-config-user-template-login\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.839817 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8qcp\" (UniqueName: \"kubernetes.io/projected/b033f7ae-7f09-4740-808c-575d3731dd96-kube-api-access-f8qcp\") pod \"oauth-openshift-7fb5d9b995-jhc52\" (UID: \"b033f7ae-7f09-4740-808c-575d3731dd96\") " pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:09 crc kubenswrapper[4727]: I1001 12:41:09.965640 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:10 crc kubenswrapper[4727]: I1001 12:41:10.208072 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7fb5d9b995-jhc52"] Oct 01 12:41:10 crc kubenswrapper[4727]: I1001 12:41:10.286131 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" Oct 01 12:41:10 crc kubenswrapper[4727]: I1001 12:41:10.286116 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-dwszm" event={"ID":"54454532-1909-4aa9-b17e-f244107b202e","Type":"ContainerDied","Data":"49b19a22b37f59b0b05e4bd54601c53febaec7bd95b8f8fbbfed49465a3d6506"} Oct 01 12:41:10 crc kubenswrapper[4727]: I1001 12:41:10.286313 4727 scope.go:117] "RemoveContainer" containerID="de1cf371611a3b346da44f6b17ea101fb8ef51d4fa0defab23a5dc923d80756b" Oct 01 12:41:10 crc kubenswrapper[4727]: I1001 12:41:10.293099 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" event={"ID":"b033f7ae-7f09-4740-808c-575d3731dd96","Type":"ContainerStarted","Data":"674c2f5dcdaf37c451975fcfee979447805a73bdb0aafb634ca63f1795fe64cf"} Oct 01 12:41:10 crc kubenswrapper[4727]: I1001 12:41:10.326353 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dwszm"] Oct 01 12:41:10 crc kubenswrapper[4727]: I1001 12:41:10.329746 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-dwszm"] Oct 01 12:41:10 crc kubenswrapper[4727]: I1001 12:41:10.378423 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54454532-1909-4aa9-b17e-f244107b202e" path="/var/lib/kubelet/pods/54454532-1909-4aa9-b17e-f244107b202e/volumes" Oct 01 12:41:11 crc kubenswrapper[4727]: I1001 12:41:11.302201 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" event={"ID":"b033f7ae-7f09-4740-808c-575d3731dd96","Type":"ContainerStarted","Data":"5d6402507770f164246d9ccc5909a91dacaabe09761d4817fcae6a4b65d57cd5"} Oct 01 12:41:11 crc kubenswrapper[4727]: I1001 12:41:11.302547 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:11 crc kubenswrapper[4727]: I1001 12:41:11.310597 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" Oct 01 12:41:11 crc kubenswrapper[4727]: I1001 12:41:11.340625 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7fb5d9b995-jhc52" podStartSLOduration=27.340588648 podStartE2EDuration="27.340588648s" podCreationTimestamp="2025-10-01 12:40:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:41:11.33594855 +0000 UTC m=+249.657303477" watchObservedRunningTime="2025-10-01 12:41:11.340588648 +0000 UTC m=+249.661943545" Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.574649 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hj5p4"] Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.575525 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hj5p4" podUID="c9a39814-1723-46e2-b468-67e6cf668788" containerName="registry-server" containerID="cri-o://9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31" gracePeriod=30 Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.578811 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8l85n"] Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.579439 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8l85n" podUID="8548d350-ee32-44e6-85d2-2e30036d5eb8" containerName="registry-server" containerID="cri-o://3a20d53158f6fc563f553367162de3e24f0261c83d1f7cb33453c517997897ef" gracePeriod=30 Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.604858 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sz95m"] Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.605139 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" podUID="ea389964-1da2-4ade-8772-b8bd1a76cc27" containerName="marketplace-operator" containerID="cri-o://a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2" gracePeriod=30 Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.607653 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kws6b"] Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.607945 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kws6b" podUID="defd3d6f-dd53-4725-af25-c711790c4870" containerName="registry-server" containerID="cri-o://18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782" gracePeriod=30 Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.617812 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jhtf"] Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.620517 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.625118 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc8w8"] Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.625402 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nc8w8" podUID="d1b0fc07-8033-4220-a491-cc668e795d10" containerName="registry-server" containerID="cri-o://67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b" gracePeriod=30 Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.638674 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jhtf"] Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.697486 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfnpj\" (UniqueName: \"kubernetes.io/projected/f2bd8192-96d6-40cd-877f-3a288140a8e9-kube-api-access-sfnpj\") pod \"marketplace-operator-79b997595-8jhtf\" (UID: \"f2bd8192-96d6-40cd-877f-3a288140a8e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.697562 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f2bd8192-96d6-40cd-877f-3a288140a8e9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8jhtf\" (UID: \"f2bd8192-96d6-40cd-877f-3a288140a8e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.697680 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2bd8192-96d6-40cd-877f-3a288140a8e9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8jhtf\" (UID: \"f2bd8192-96d6-40cd-877f-3a288140a8e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.798779 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f2bd8192-96d6-40cd-877f-3a288140a8e9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8jhtf\" (UID: \"f2bd8192-96d6-40cd-877f-3a288140a8e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.798874 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2bd8192-96d6-40cd-877f-3a288140a8e9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8jhtf\" (UID: \"f2bd8192-96d6-40cd-877f-3a288140a8e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.798927 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfnpj\" (UniqueName: \"kubernetes.io/projected/f2bd8192-96d6-40cd-877f-3a288140a8e9-kube-api-access-sfnpj\") pod \"marketplace-operator-79b997595-8jhtf\" (UID: \"f2bd8192-96d6-40cd-877f-3a288140a8e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.801216 4727 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2bd8192-96d6-40cd-877f-3a288140a8e9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8jhtf\" (UID: \"f2bd8192-96d6-40cd-877f-3a288140a8e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.805652 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f2bd8192-96d6-40cd-877f-3a288140a8e9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8jhtf\" (UID: \"f2bd8192-96d6-40cd-877f-3a288140a8e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.819420 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfnpj\" (UniqueName: \"kubernetes.io/projected/f2bd8192-96d6-40cd-877f-3a288140a8e9-kube-api-access-sfnpj\") pod \"marketplace-operator-79b997595-8jhtf\" (UID: \"f2bd8192-96d6-40cd-877f-3a288140a8e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:23 crc kubenswrapper[4727]: I1001 12:41:23.950385 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.062702 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.068234 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.075990 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.090136 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.101709 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a39814-1723-46e2-b468-67e6cf668788-utilities\") pod \"c9a39814-1723-46e2-b468-67e6cf668788\" (UID: \"c9a39814-1723-46e2-b468-67e6cf668788\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.101755 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4h42\" (UniqueName: \"kubernetes.io/projected/c9a39814-1723-46e2-b468-67e6cf668788-kube-api-access-r4h42\") pod \"c9a39814-1723-46e2-b468-67e6cf668788\" (UID: \"c9a39814-1723-46e2-b468-67e6cf668788\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.101813 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a39814-1723-46e2-b468-67e6cf668788-catalog-content\") pod \"c9a39814-1723-46e2-b468-67e6cf668788\" (UID: \"c9a39814-1723-46e2-b468-67e6cf668788\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.104391 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a39814-1723-46e2-b468-67e6cf668788-utilities" (OuterVolumeSpecName: "utilities") pod "c9a39814-1723-46e2-b468-67e6cf668788" (UID: "c9a39814-1723-46e2-b468-67e6cf668788"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.105769 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a39814-1723-46e2-b468-67e6cf668788-kube-api-access-r4h42" (OuterVolumeSpecName: "kube-api-access-r4h42") pod "c9a39814-1723-46e2-b468-67e6cf668788" (UID: "c9a39814-1723-46e2-b468-67e6cf668788"). InnerVolumeSpecName "kube-api-access-r4h42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.166792 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a39814-1723-46e2-b468-67e6cf668788-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9a39814-1723-46e2-b468-67e6cf668788" (UID: "c9a39814-1723-46e2-b468-67e6cf668788"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.203553 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1b0fc07-8033-4220-a491-cc668e795d10-utilities\") pod \"d1b0fc07-8033-4220-a491-cc668e795d10\" (UID: \"d1b0fc07-8033-4220-a491-cc668e795d10\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.203629 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea389964-1da2-4ade-8772-b8bd1a76cc27-marketplace-operator-metrics\") pod \"ea389964-1da2-4ade-8772-b8bd1a76cc27\" (UID: \"ea389964-1da2-4ade-8772-b8bd1a76cc27\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.203683 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvr5b\" (UniqueName: \"kubernetes.io/projected/d1b0fc07-8033-4220-a491-cc668e795d10-kube-api-access-rvr5b\") pod \"d1b0fc07-8033-4220-a491-cc668e795d10\" (UID: \"d1b0fc07-8033-4220-a491-cc668e795d10\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.203721 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ftsr\" (UniqueName: \"kubernetes.io/projected/ea389964-1da2-4ade-8772-b8bd1a76cc27-kube-api-access-2ftsr\") pod \"ea389964-1da2-4ade-8772-b8bd1a76cc27\" (UID: \"ea389964-1da2-4ade-8772-b8bd1a76cc27\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.203747 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1b0fc07-8033-4220-a491-cc668e795d10-catalog-content\") pod \"d1b0fc07-8033-4220-a491-cc668e795d10\" (UID: \"d1b0fc07-8033-4220-a491-cc668e795d10\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.203782 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defd3d6f-dd53-4725-af25-c711790c4870-utilities\") pod \"defd3d6f-dd53-4725-af25-c711790c4870\" (UID: \"defd3d6f-dd53-4725-af25-c711790c4870\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.203824 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnnmb\" (UniqueName: \"kubernetes.io/projected/defd3d6f-dd53-4725-af25-c711790c4870-kube-api-access-xnnmb\") pod \"defd3d6f-dd53-4725-af25-c711790c4870\" (UID: \"defd3d6f-dd53-4725-af25-c711790c4870\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.203871 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea389964-1da2-4ade-8772-b8bd1a76cc27-marketplace-trusted-ca\") pod \"ea389964-1da2-4ade-8772-b8bd1a76cc27\" (UID: \"ea389964-1da2-4ade-8772-b8bd1a76cc27\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.203912 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defd3d6f-dd53-4725-af25-c711790c4870-catalog-content\") pod \"defd3d6f-dd53-4725-af25-c711790c4870\" (UID: \"defd3d6f-dd53-4725-af25-c711790c4870\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.204165 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a39814-1723-46e2-b468-67e6cf668788-utilities\") on node \"crc\" 
DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.204183 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4h42\" (UniqueName: \"kubernetes.io/projected/c9a39814-1723-46e2-b468-67e6cf668788-kube-api-access-r4h42\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.204196 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a39814-1723-46e2-b468-67e6cf668788-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.204266 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1b0fc07-8033-4220-a491-cc668e795d10-utilities" (OuterVolumeSpecName: "utilities") pod "d1b0fc07-8033-4220-a491-cc668e795d10" (UID: "d1b0fc07-8033-4220-a491-cc668e795d10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.204828 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/defd3d6f-dd53-4725-af25-c711790c4870-utilities" (OuterVolumeSpecName: "utilities") pod "defd3d6f-dd53-4725-af25-c711790c4870" (UID: "defd3d6f-dd53-4725-af25-c711790c4870"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.206308 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b0fc07-8033-4220-a491-cc668e795d10-kube-api-access-rvr5b" (OuterVolumeSpecName: "kube-api-access-rvr5b") pod "d1b0fc07-8033-4220-a491-cc668e795d10" (UID: "d1b0fc07-8033-4220-a491-cc668e795d10"). InnerVolumeSpecName "kube-api-access-rvr5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.206931 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea389964-1da2-4ade-8772-b8bd1a76cc27-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ea389964-1da2-4ade-8772-b8bd1a76cc27" (UID: "ea389964-1da2-4ade-8772-b8bd1a76cc27"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.207659 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea389964-1da2-4ade-8772-b8bd1a76cc27-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ea389964-1da2-4ade-8772-b8bd1a76cc27" (UID: "ea389964-1da2-4ade-8772-b8bd1a76cc27"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.207893 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea389964-1da2-4ade-8772-b8bd1a76cc27-kube-api-access-2ftsr" (OuterVolumeSpecName: "kube-api-access-2ftsr") pod "ea389964-1da2-4ade-8772-b8bd1a76cc27" (UID: "ea389964-1da2-4ade-8772-b8bd1a76cc27"). InnerVolumeSpecName "kube-api-access-2ftsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.209469 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/defd3d6f-dd53-4725-af25-c711790c4870-kube-api-access-xnnmb" (OuterVolumeSpecName: "kube-api-access-xnnmb") pod "defd3d6f-dd53-4725-af25-c711790c4870" (UID: "defd3d6f-dd53-4725-af25-c711790c4870"). InnerVolumeSpecName "kube-api-access-xnnmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.221781 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/defd3d6f-dd53-4725-af25-c711790c4870-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "defd3d6f-dd53-4725-af25-c711790c4870" (UID: "defd3d6f-dd53-4725-af25-c711790c4870"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.305636 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defd3d6f-dd53-4725-af25-c711790c4870-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.305681 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnnmb\" (UniqueName: \"kubernetes.io/projected/defd3d6f-dd53-4725-af25-c711790c4870-kube-api-access-xnnmb\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.305697 4727 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea389964-1da2-4ade-8772-b8bd1a76cc27-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.305709 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defd3d6f-dd53-4725-af25-c711790c4870-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.305721 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1b0fc07-8033-4220-a491-cc668e795d10-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.305733 4727 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea389964-1da2-4ade-8772-b8bd1a76cc27-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.305746 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvr5b\" (UniqueName: \"kubernetes.io/projected/d1b0fc07-8033-4220-a491-cc668e795d10-kube-api-access-rvr5b\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.305759 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ftsr\" (UniqueName: \"kubernetes.io/projected/ea389964-1da2-4ade-8772-b8bd1a76cc27-kube-api-access-2ftsr\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.312134 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1b0fc07-8033-4220-a491-cc668e795d10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1b0fc07-8033-4220-a491-cc668e795d10" (UID: "d1b0fc07-8033-4220-a491-cc668e795d10"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.386732 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jhtf"] Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.390346 4727 generic.go:334] "Generic (PLEG): container finished" podID="8548d350-ee32-44e6-85d2-2e30036d5eb8" containerID="3a20d53158f6fc563f553367162de3e24f0261c83d1f7cb33453c517997897ef" exitCode=0 Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.390436 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l85n" event={"ID":"8548d350-ee32-44e6-85d2-2e30036d5eb8","Type":"ContainerDied","Data":"3a20d53158f6fc563f553367162de3e24f0261c83d1f7cb33453c517997897ef"} Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.393095 4727 generic.go:334] "Generic (PLEG): container finished" podID="defd3d6f-dd53-4725-af25-c711790c4870" containerID="18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782" exitCode=0 Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.393167 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kws6b" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.393239 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kws6b" event={"ID":"defd3d6f-dd53-4725-af25-c711790c4870","Type":"ContainerDied","Data":"18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782"} Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.393267 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kws6b" event={"ID":"defd3d6f-dd53-4725-af25-c711790c4870","Type":"ContainerDied","Data":"f179553b92247d17a97d63db8e2672690e4634405ea59032af36e7b5575a061c"} Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.393289 4727 scope.go:117] "RemoveContainer" containerID="18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782" Oct 01 12:41:24 crc kubenswrapper[4727]: W1001 12:41:24.403742 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2bd8192_96d6_40cd_877f_3a288140a8e9.slice/crio-59a99c51c37cc0278d6abcd01a85783cee90816905815a7fd94cd38c3b7b706f WatchSource:0}: Error finding container 59a99c51c37cc0278d6abcd01a85783cee90816905815a7fd94cd38c3b7b706f: Status 404 returned error can't find the container with id 59a99c51c37cc0278d6abcd01a85783cee90816905815a7fd94cd38c3b7b706f Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.404884 4727 generic.go:334] "Generic (PLEG): container finished" podID="d1b0fc07-8033-4220-a491-cc668e795d10" containerID="67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b" exitCode=0 Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.405240 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8w8" event={"ID":"d1b0fc07-8033-4220-a491-cc668e795d10","Type":"ContainerDied","Data":"67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b"} Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.405324 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc8w8" 
event={"ID":"d1b0fc07-8033-4220-a491-cc668e795d10","Type":"ContainerDied","Data":"c104fdd951a10df6504eeae8ca9d6fa0552f05afe1e9a6b646e2ad2eaa0a43c6"} Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.405473 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc8w8" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.406682 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1b0fc07-8033-4220-a491-cc668e795d10-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.410335 4727 generic.go:334] "Generic (PLEG): container finished" podID="ea389964-1da2-4ade-8772-b8bd1a76cc27" containerID="a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2" exitCode=0 Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.410475 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" event={"ID":"ea389964-1da2-4ade-8772-b8bd1a76cc27","Type":"ContainerDied","Data":"a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2"} Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.410537 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" event={"ID":"ea389964-1da2-4ade-8772-b8bd1a76cc27","Type":"ContainerDied","Data":"0c0c4134a8e85925c85f0b66719ab66af848303596747f568d95b33fa90f7b40"} Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.410614 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sz95m" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.419222 4727 generic.go:334] "Generic (PLEG): container finished" podID="c9a39814-1723-46e2-b468-67e6cf668788" containerID="9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31" exitCode=0 Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.419390 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj5p4" event={"ID":"c9a39814-1723-46e2-b468-67e6cf668788","Type":"ContainerDied","Data":"9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31"} Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.419420 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj5p4" event={"ID":"c9a39814-1723-46e2-b468-67e6cf668788","Type":"ContainerDied","Data":"ada921ffbf3889ad27d9dd2e578a5944c8216ef297af7cbfcaad23a18979837b"} Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.419450 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hj5p4" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.424157 4727 scope.go:117] "RemoveContainer" containerID="7ba087df4c3d5b49eea3df564075bfe4a20840ba5e5f322cc9646e9485e1e7d4" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.450657 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.463298 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kws6b"] Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.465477 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kws6b"] Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.474297 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc8w8"] Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.478204 4727 scope.go:117] "RemoveContainer" containerID="4c7269cb9d1c4764fb2d999e192b9cf5c53aa5b732d2b6149e67229b8aea5ed4" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.478694 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nc8w8"] Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.500815 4727 scope.go:117] "RemoveContainer" containerID="18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782" Oct 01 12:41:24 crc kubenswrapper[4727]: E1001 12:41:24.501895 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782\": container with ID starting with 18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782 not found: ID does not exist" containerID="18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.501941 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782"} err="failed to get container status \"18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782\": rpc error: code = NotFound desc = could not find container \"18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782\": container with ID starting with 18bd77756476c61358c6bdf394d0772e799ba9394234aadc913ca40d51514782 not found: ID does not exist" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.501969 4727 scope.go:117] "RemoveContainer" containerID="7ba087df4c3d5b49eea3df564075bfe4a20840ba5e5f322cc9646e9485e1e7d4" Oct 01 12:41:24 crc kubenswrapper[4727]: E1001 12:41:24.503452 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ba087df4c3d5b49eea3df564075bfe4a20840ba5e5f322cc9646e9485e1e7d4\": container with ID starting with 7ba087df4c3d5b49eea3df564075bfe4a20840ba5e5f322cc9646e9485e1e7d4 not found: ID does not exist" containerID="7ba087df4c3d5b49eea3df564075bfe4a20840ba5e5f322cc9646e9485e1e7d4" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.503818 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ba087df4c3d5b49eea3df564075bfe4a20840ba5e5f322cc9646e9485e1e7d4"} err="failed to get container status \"7ba087df4c3d5b49eea3df564075bfe4a20840ba5e5f322cc9646e9485e1e7d4\": rpc error: code = NotFound desc = could not find container \"7ba087df4c3d5b49eea3df564075bfe4a20840ba5e5f322cc9646e9485e1e7d4\": container with ID starting with 7ba087df4c3d5b49eea3df564075bfe4a20840ba5e5f322cc9646e9485e1e7d4 not found: ID does not exist" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.503985 4727 scope.go:117] "RemoveContainer" 
containerID="4c7269cb9d1c4764fb2d999e192b9cf5c53aa5b732d2b6149e67229b8aea5ed4" Oct 01 12:41:24 crc kubenswrapper[4727]: E1001 12:41:24.504496 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c7269cb9d1c4764fb2d999e192b9cf5c53aa5b732d2b6149e67229b8aea5ed4\": container with ID starting with 4c7269cb9d1c4764fb2d999e192b9cf5c53aa5b732d2b6149e67229b8aea5ed4 not found: ID does not exist" containerID="4c7269cb9d1c4764fb2d999e192b9cf5c53aa5b732d2b6149e67229b8aea5ed4" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.504525 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7269cb9d1c4764fb2d999e192b9cf5c53aa5b732d2b6149e67229b8aea5ed4"} err="failed to get container status \"4c7269cb9d1c4764fb2d999e192b9cf5c53aa5b732d2b6149e67229b8aea5ed4\": rpc error: code = NotFound desc = could not find container \"4c7269cb9d1c4764fb2d999e192b9cf5c53aa5b732d2b6149e67229b8aea5ed4\": container with ID starting with 4c7269cb9d1c4764fb2d999e192b9cf5c53aa5b732d2b6149e67229b8aea5ed4 not found: ID does not exist" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.504544 4727 scope.go:117] "RemoveContainer" containerID="67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.507454 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8548d350-ee32-44e6-85d2-2e30036d5eb8-utilities\") pod \"8548d350-ee32-44e6-85d2-2e30036d5eb8\" (UID: \"8548d350-ee32-44e6-85d2-2e30036d5eb8\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.507669 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4l2d\" (UniqueName: \"kubernetes.io/projected/8548d350-ee32-44e6-85d2-2e30036d5eb8-kube-api-access-g4l2d\") pod \"8548d350-ee32-44e6-85d2-2e30036d5eb8\" (UID: \"8548d350-ee32-44e6-85d2-2e30036d5eb8\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.508170 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8548d350-ee32-44e6-85d2-2e30036d5eb8-catalog-content\") pod \"8548d350-ee32-44e6-85d2-2e30036d5eb8\" (UID: \"8548d350-ee32-44e6-85d2-2e30036d5eb8\") " Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.514069 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sz95m"] Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.515025 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8548d350-ee32-44e6-85d2-2e30036d5eb8-utilities" (OuterVolumeSpecName: "utilities") pod "8548d350-ee32-44e6-85d2-2e30036d5eb8" (UID: "8548d350-ee32-44e6-85d2-2e30036d5eb8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.517486 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8548d350-ee32-44e6-85d2-2e30036d5eb8-kube-api-access-g4l2d" (OuterVolumeSpecName: "kube-api-access-g4l2d") pod "8548d350-ee32-44e6-85d2-2e30036d5eb8" (UID: "8548d350-ee32-44e6-85d2-2e30036d5eb8"). InnerVolumeSpecName "kube-api-access-g4l2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.534589 4727 scope.go:117] "RemoveContainer" containerID="dca42bdb1470accb73b6ee487d638512b588f9a2fdc4c0365068a922a79016e4" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.534881 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sz95m"] Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.540774 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hj5p4"] Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.559100 4727 scope.go:117] "RemoveContainer" containerID="3ab9b4d3f81f00309d02784c98d65551bcd1344e6137a042bb8f0cd1159a6054" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.560621 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hj5p4"] Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.569466 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8548d350-ee32-44e6-85d2-2e30036d5eb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8548d350-ee32-44e6-85d2-2e30036d5eb8" (UID: "8548d350-ee32-44e6-85d2-2e30036d5eb8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.571220 4727 scope.go:117] "RemoveContainer" containerID="67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b" Oct 01 12:41:24 crc kubenswrapper[4727]: E1001 12:41:24.571726 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b\": container with ID starting with 67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b not found: ID does not exist" containerID="67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.571837 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b"} err="failed to get container status \"67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b\": rpc error: code = NotFound desc = could not find container \"67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b\": container with ID starting with 67bd348f31ceed3cf09214d5e8eed4695980248ab7db8d27627ffcb7df579f9b not found: ID does not exist" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.571928 4727 scope.go:117] "RemoveContainer" containerID="dca42bdb1470accb73b6ee487d638512b588f9a2fdc4c0365068a922a79016e4" Oct 01 12:41:24 crc kubenswrapper[4727]: E1001 12:41:24.572544 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca42bdb1470accb73b6ee487d638512b588f9a2fdc4c0365068a922a79016e4\": container with ID starting with dca42bdb1470accb73b6ee487d638512b588f9a2fdc4c0365068a922a79016e4 not found: ID does not exist" containerID="dca42bdb1470accb73b6ee487d638512b588f9a2fdc4c0365068a922a79016e4" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.572592 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca42bdb1470accb73b6ee487d638512b588f9a2fdc4c0365068a922a79016e4"} err="failed to get container status 
\"dca42bdb1470accb73b6ee487d638512b588f9a2fdc4c0365068a922a79016e4\": rpc error: code = NotFound desc = could not find container \"dca42bdb1470accb73b6ee487d638512b588f9a2fdc4c0365068a922a79016e4\": container with ID starting with dca42bdb1470accb73b6ee487d638512b588f9a2fdc4c0365068a922a79016e4 not found: ID does not exist" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.572626 4727 scope.go:117] "RemoveContainer" containerID="3ab9b4d3f81f00309d02784c98d65551bcd1344e6137a042bb8f0cd1159a6054" Oct 01 12:41:24 crc kubenswrapper[4727]: E1001 12:41:24.572888 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ab9b4d3f81f00309d02784c98d65551bcd1344e6137a042bb8f0cd1159a6054\": container with ID starting with 3ab9b4d3f81f00309d02784c98d65551bcd1344e6137a042bb8f0cd1159a6054 not found: ID does not exist" containerID="3ab9b4d3f81f00309d02784c98d65551bcd1344e6137a042bb8f0cd1159a6054" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.573029 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab9b4d3f81f00309d02784c98d65551bcd1344e6137a042bb8f0cd1159a6054"} err="failed to get container status \"3ab9b4d3f81f00309d02784c98d65551bcd1344e6137a042bb8f0cd1159a6054\": rpc error: code = NotFound desc = could not find container \"3ab9b4d3f81f00309d02784c98d65551bcd1344e6137a042bb8f0cd1159a6054\": container with ID starting with 3ab9b4d3f81f00309d02784c98d65551bcd1344e6137a042bb8f0cd1159a6054 not found: ID does not exist" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.573131 4727 scope.go:117] "RemoveContainer" containerID="a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.586287 4727 scope.go:117] "RemoveContainer" containerID="a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2" Oct 01 12:41:24 crc kubenswrapper[4727]: E1001 12:41:24.587300 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2\": container with ID starting with a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2 not found: ID does not exist" containerID="a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.587396 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2"} err="failed to get container status \"a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2\": rpc error: code = NotFound desc = could not find container \"a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2\": container with ID starting with a625a79f3dfd18091559a9fc965ba1970c25ed879e4c7a021028f4be86bdd5e2 not found: ID does not exist" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.587487 4727 scope.go:117] "RemoveContainer" containerID="9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.600788 4727 scope.go:117] "RemoveContainer" containerID="d81cdce9e5c45d540de27903de21675e1fecb5452e4c96b593a707a5974344ed" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.609912 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8548d350-ee32-44e6-85d2-2e30036d5eb8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.609936 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8548d350-ee32-44e6-85d2-2e30036d5eb8-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.609948 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4l2d\" (UniqueName: \"kubernetes.io/projected/8548d350-ee32-44e6-85d2-2e30036d5eb8-kube-api-access-g4l2d\") on node \"crc\" DevicePath \"\"" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.614983 4727 scope.go:117] "RemoveContainer" containerID="0dec39fec16013a8387e65a723203929d58d7ea1fcf8e53c988ad0b25b23b35b" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.633880 4727 scope.go:117] "RemoveContainer" containerID="9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31" Oct 01 12:41:24 crc kubenswrapper[4727]: E1001 12:41:24.634745 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31\": container with ID starting with 9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31 not found: ID does not exist" containerID="9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.634795 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31"} err="failed to get container status \"9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31\": rpc error: code = NotFound desc = could not find container \"9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31\": container with ID starting with 9bb260d3282e05f323820fb3d0e0a921868d17b6516b38642c22d4bc25aede31 not found: ID does not exist" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.634829 4727 scope.go:117] "RemoveContainer" containerID="d81cdce9e5c45d540de27903de21675e1fecb5452e4c96b593a707a5974344ed" Oct 01 12:41:24 crc kubenswrapper[4727]: E1001 12:41:24.635254 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d81cdce9e5c45d540de27903de21675e1fecb5452e4c96b593a707a5974344ed\": container with ID starting with d81cdce9e5c45d540de27903de21675e1fecb5452e4c96b593a707a5974344ed not found: ID does not exist" containerID="d81cdce9e5c45d540de27903de21675e1fecb5452e4c96b593a707a5974344ed" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.635284 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d81cdce9e5c45d540de27903de21675e1fecb5452e4c96b593a707a5974344ed"} err="failed to get container status \"d81cdce9e5c45d540de27903de21675e1fecb5452e4c96b593a707a5974344ed\": rpc error: code = NotFound desc = could not find container \"d81cdce9e5c45d540de27903de21675e1fecb5452e4c96b593a707a5974344ed\": container with ID starting with d81cdce9e5c45d540de27903de21675e1fecb5452e4c96b593a707a5974344ed not found: ID does not exist" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.635305 4727 scope.go:117] "RemoveContainer" containerID="0dec39fec16013a8387e65a723203929d58d7ea1fcf8e53c988ad0b25b23b35b" Oct 01 12:41:24 crc kubenswrapper[4727]: E1001 12:41:24.635696 4727 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dec39fec16013a8387e65a723203929d58d7ea1fcf8e53c988ad0b25b23b35b\": container with ID starting with 0dec39fec16013a8387e65a723203929d58d7ea1fcf8e53c988ad0b25b23b35b not found: ID does not exist" containerID="0dec39fec16013a8387e65a723203929d58d7ea1fcf8e53c988ad0b25b23b35b" Oct 01 12:41:24 crc kubenswrapper[4727]: I1001 12:41:24.635725 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dec39fec16013a8387e65a723203929d58d7ea1fcf8e53c988ad0b25b23b35b"} err="failed to get container status \"0dec39fec16013a8387e65a723203929d58d7ea1fcf8e53c988ad0b25b23b35b\": rpc error: code = NotFound desc = could not find container \"0dec39fec16013a8387e65a723203929d58d7ea1fcf8e53c988ad0b25b23b35b\": container with ID starting with 0dec39fec16013a8387e65a723203929d58d7ea1fcf8e53c988ad0b25b23b35b not found: ID does not exist" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.428049 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" event={"ID":"f2bd8192-96d6-40cd-877f-3a288140a8e9","Type":"ContainerStarted","Data":"16434f40516e6f41df09dd3765e02047521b7e69f7cc6855c3a8555d5d340f75"} Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.428549 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.428581 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" event={"ID":"f2bd8192-96d6-40cd-877f-3a288140a8e9","Type":"ContainerStarted","Data":"59a99c51c37cc0278d6abcd01a85783cee90816905815a7fd94cd38c3b7b706f"} Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.430589 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l85n" event={"ID":"8548d350-ee32-44e6-85d2-2e30036d5eb8","Type":"ContainerDied","Data":"8b38457f456bd25dc8a3c39e393d06606a6233a1ba675701a88c901d14d413da"} Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.430639 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8l85n" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.430670 4727 scope.go:117] "RemoveContainer" containerID="3a20d53158f6fc563f553367162de3e24f0261c83d1f7cb33453c517997897ef" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.435419 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.450161 4727 scope.go:117] "RemoveContainer" containerID="92f21923a9b1375915ebed149f16ea09a8db36ce0e0c84ce69b251086f97b2e3" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.462751 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8jhtf" podStartSLOduration=2.462724627 podStartE2EDuration="2.462724627s" podCreationTimestamp="2025-10-01 12:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:41:25.451243454 +0000 UTC m=+263.772598311" watchObservedRunningTime="2025-10-01 12:41:25.462724627 +0000 UTC m=+263.784079474" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.496432 4727 scope.go:117] "RemoveContainer" containerID="afc8ccf487c14a4f9edf55263037becb617cf50031103899fb6f4e760874f06a" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.503289 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8l85n"] Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.506924 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8l85n"] Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786383 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dk8rp"] Oct 01 12:41:25 crc kubenswrapper[4727]: E1001 12:41:25.786793 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea389964-1da2-4ade-8772-b8bd1a76cc27" containerName="marketplace-operator" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786811 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea389964-1da2-4ade-8772-b8bd1a76cc27" containerName="marketplace-operator" Oct 01 12:41:25 crc kubenswrapper[4727]: E1001 12:41:25.786822 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b0fc07-8033-4220-a491-cc668e795d10" containerName="registry-server" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786828 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b0fc07-8033-4220-a491-cc668e795d10" containerName="registry-server" Oct 01 12:41:25 crc kubenswrapper[4727]: E1001 12:41:25.786838 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a39814-1723-46e2-b468-67e6cf668788" containerName="extract-content" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786845 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a39814-1723-46e2-b468-67e6cf668788" containerName="extract-content" Oct 01 12:41:25 crc kubenswrapper[4727]: E1001 12:41:25.786852 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a39814-1723-46e2-b468-67e6cf668788" containerName="extract-utilities" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786858 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a39814-1723-46e2-b468-67e6cf668788" containerName="extract-utilities" Oct 01 12:41:25 crc 
kubenswrapper[4727]: E1001 12:41:25.786868 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defd3d6f-dd53-4725-af25-c711790c4870" containerName="extract-content" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786874 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="defd3d6f-dd53-4725-af25-c711790c4870" containerName="extract-content" Oct 01 12:41:25 crc kubenswrapper[4727]: E1001 12:41:25.786882 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8548d350-ee32-44e6-85d2-2e30036d5eb8" containerName="extract-content" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786888 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="8548d350-ee32-44e6-85d2-2e30036d5eb8" containerName="extract-content" Oct 01 12:41:25 crc kubenswrapper[4727]: E1001 12:41:25.786894 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a39814-1723-46e2-b468-67e6cf668788" containerName="registry-server" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786899 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a39814-1723-46e2-b468-67e6cf668788" containerName="registry-server" Oct 01 12:41:25 crc kubenswrapper[4727]: E1001 12:41:25.786911 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8548d350-ee32-44e6-85d2-2e30036d5eb8" containerName="registry-server" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786917 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="8548d350-ee32-44e6-85d2-2e30036d5eb8" containerName="registry-server" Oct 01 12:41:25 crc kubenswrapper[4727]: E1001 12:41:25.786926 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defd3d6f-dd53-4725-af25-c711790c4870" containerName="registry-server" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786932 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="defd3d6f-dd53-4725-af25-c711790c4870" containerName="registry-server" Oct 01 12:41:25 crc kubenswrapper[4727]: E1001 12:41:25.786940 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b0fc07-8033-4220-a491-cc668e795d10" containerName="extract-utilities" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786947 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b0fc07-8033-4220-a491-cc668e795d10" containerName="extract-utilities" Oct 01 12:41:25 crc kubenswrapper[4727]: E1001 12:41:25.786957 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b0fc07-8033-4220-a491-cc668e795d10" containerName="extract-content" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786963 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b0fc07-8033-4220-a491-cc668e795d10" containerName="extract-content" Oct 01 12:41:25 crc kubenswrapper[4727]: E1001 12:41:25.786972 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8548d350-ee32-44e6-85d2-2e30036d5eb8" containerName="extract-utilities" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786977 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="8548d350-ee32-44e6-85d2-2e30036d5eb8" containerName="extract-utilities" Oct 01 12:41:25 crc kubenswrapper[4727]: E1001 12:41:25.786986 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defd3d6f-dd53-4725-af25-c711790c4870" containerName="extract-utilities" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.786991 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="defd3d6f-dd53-4725-af25-c711790c4870" 
containerName="extract-utilities" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.787109 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b0fc07-8033-4220-a491-cc668e795d10" containerName="registry-server" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.787125 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="8548d350-ee32-44e6-85d2-2e30036d5eb8" containerName="registry-server" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.787136 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a39814-1723-46e2-b468-67e6cf668788" containerName="registry-server" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.787143 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea389964-1da2-4ade-8772-b8bd1a76cc27" containerName="marketplace-operator" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.787150 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="defd3d6f-dd53-4725-af25-c711790c4870" containerName="registry-server" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.787815 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.793171 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.802499 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dk8rp"] Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.833144 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24a2466-b1b7-4da7-bc8a-03d9add0de40-catalog-content\") pod \"redhat-marketplace-dk8rp\" (UID: \"b24a2466-b1b7-4da7-bc8a-03d9add0de40\") " pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.833189 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24a2466-b1b7-4da7-bc8a-03d9add0de40-utilities\") pod \"redhat-marketplace-dk8rp\" (UID: \"b24a2466-b1b7-4da7-bc8a-03d9add0de40\") " pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.833244 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjstz\" (UniqueName: \"kubernetes.io/projected/b24a2466-b1b7-4da7-bc8a-03d9add0de40-kube-api-access-rjstz\") pod \"redhat-marketplace-dk8rp\" (UID: \"b24a2466-b1b7-4da7-bc8a-03d9add0de40\") " pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.934964 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24a2466-b1b7-4da7-bc8a-03d9add0de40-catalog-content\") pod \"redhat-marketplace-dk8rp\" (UID: \"b24a2466-b1b7-4da7-bc8a-03d9add0de40\") " pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.935408 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24a2466-b1b7-4da7-bc8a-03d9add0de40-utilities\") pod \"redhat-marketplace-dk8rp\" (UID: 
\"b24a2466-b1b7-4da7-bc8a-03d9add0de40\") " pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.935429 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24a2466-b1b7-4da7-bc8a-03d9add0de40-catalog-content\") pod \"redhat-marketplace-dk8rp\" (UID: \"b24a2466-b1b7-4da7-bc8a-03d9add0de40\") " pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.935509 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjstz\" (UniqueName: \"kubernetes.io/projected/b24a2466-b1b7-4da7-bc8a-03d9add0de40-kube-api-access-rjstz\") pod \"redhat-marketplace-dk8rp\" (UID: \"b24a2466-b1b7-4da7-bc8a-03d9add0de40\") " pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.936058 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24a2466-b1b7-4da7-bc8a-03d9add0de40-utilities\") pod \"redhat-marketplace-dk8rp\" (UID: \"b24a2466-b1b7-4da7-bc8a-03d9add0de40\") " pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.955302 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjstz\" (UniqueName: \"kubernetes.io/projected/b24a2466-b1b7-4da7-bc8a-03d9add0de40-kube-api-access-rjstz\") pod \"redhat-marketplace-dk8rp\" (UID: \"b24a2466-b1b7-4da7-bc8a-03d9add0de40\") " pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.988892 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xq5nj"] Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.991714 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.997545 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xq5nj"] Oct 01 12:41:25 crc kubenswrapper[4727]: I1001 12:41:25.999180 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.036421 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2b91c6-4b36-4774-b2ca-59e9d5757b15-catalog-content\") pod \"community-operators-xq5nj\" (UID: \"9d2b91c6-4b36-4774-b2ca-59e9d5757b15\") " pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.036484 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2ptk\" (UniqueName: \"kubernetes.io/projected/9d2b91c6-4b36-4774-b2ca-59e9d5757b15-kube-api-access-j2ptk\") pod \"community-operators-xq5nj\" (UID: \"9d2b91c6-4b36-4774-b2ca-59e9d5757b15\") " pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.036523 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2b91c6-4b36-4774-b2ca-59e9d5757b15-utilities\") pod \"community-operators-xq5nj\" (UID: \"9d2b91c6-4b36-4774-b2ca-59e9d5757b15\") " pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.138156 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2b91c6-4b36-4774-b2ca-59e9d5757b15-catalog-content\") pod \"community-operators-xq5nj\" (UID: \"9d2b91c6-4b36-4774-b2ca-59e9d5757b15\") " pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.138278 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2ptk\" (UniqueName: \"kubernetes.io/projected/9d2b91c6-4b36-4774-b2ca-59e9d5757b15-kube-api-access-j2ptk\") pod \"community-operators-xq5nj\" (UID: \"9d2b91c6-4b36-4774-b2ca-59e9d5757b15\") " pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.138353 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2b91c6-4b36-4774-b2ca-59e9d5757b15-utilities\") pod \"community-operators-xq5nj\" (UID: \"9d2b91c6-4b36-4774-b2ca-59e9d5757b15\") " pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.138775 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2b91c6-4b36-4774-b2ca-59e9d5757b15-catalog-content\") pod \"community-operators-xq5nj\" (UID: \"9d2b91c6-4b36-4774-b2ca-59e9d5757b15\") " pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.138929 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2b91c6-4b36-4774-b2ca-59e9d5757b15-utilities\") pod \"community-operators-xq5nj\" (UID: 
\"9d2b91c6-4b36-4774-b2ca-59e9d5757b15\") " pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.141477 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.160680 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2ptk\" (UniqueName: \"kubernetes.io/projected/9d2b91c6-4b36-4774-b2ca-59e9d5757b15-kube-api-access-j2ptk\") pod \"community-operators-xq5nj\" (UID: \"9d2b91c6-4b36-4774-b2ca-59e9d5757b15\") " pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.307315 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.351635 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dk8rp"] Oct 01 12:41:26 crc kubenswrapper[4727]: W1001 12:41:26.352435 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb24a2466_b1b7_4da7_bc8a_03d9add0de40.slice/crio-1d8750d1b7558aff13678f58449fcfa2b1ecfc01694a0c4cd7acf006eb908f21 WatchSource:0}: Error finding container 1d8750d1b7558aff13678f58449fcfa2b1ecfc01694a0c4cd7acf006eb908f21: Status 404 returned error can't find the container with id 1d8750d1b7558aff13678f58449fcfa2b1ecfc01694a0c4cd7acf006eb908f21 Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.380585 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8548d350-ee32-44e6-85d2-2e30036d5eb8" path="/var/lib/kubelet/pods/8548d350-ee32-44e6-85d2-2e30036d5eb8/volumes" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.381381 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a39814-1723-46e2-b468-67e6cf668788" path="/var/lib/kubelet/pods/c9a39814-1723-46e2-b468-67e6cf668788/volumes" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.381957 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1b0fc07-8033-4220-a491-cc668e795d10" path="/var/lib/kubelet/pods/d1b0fc07-8033-4220-a491-cc668e795d10/volumes" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.383034 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="defd3d6f-dd53-4725-af25-c711790c4870" path="/var/lib/kubelet/pods/defd3d6f-dd53-4725-af25-c711790c4870/volumes" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.383666 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea389964-1da2-4ade-8772-b8bd1a76cc27" path="/var/lib/kubelet/pods/ea389964-1da2-4ade-8772-b8bd1a76cc27/volumes" Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.451643 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dk8rp" event={"ID":"b24a2466-b1b7-4da7-bc8a-03d9add0de40","Type":"ContainerStarted","Data":"1d8750d1b7558aff13678f58449fcfa2b1ecfc01694a0c4cd7acf006eb908f21"} Oct 01 12:41:26 crc kubenswrapper[4727]: I1001 12:41:26.496379 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xq5nj"] Oct 01 12:41:26 crc kubenswrapper[4727]: W1001 12:41:26.500533 4727 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d2b91c6_4b36_4774_b2ca_59e9d5757b15.slice/crio-760d593e15eb22ca87500e37f7a149a2a97f205a6284b509861bd990b973de5c WatchSource:0}: Error finding container 760d593e15eb22ca87500e37f7a149a2a97f205a6284b509861bd990b973de5c: Status 404 returned error can't find the container with id 760d593e15eb22ca87500e37f7a149a2a97f205a6284b509861bd990b973de5c Oct 01 12:41:27 crc kubenswrapper[4727]: I1001 12:41:27.462554 4727 generic.go:334] "Generic (PLEG): container finished" podID="b24a2466-b1b7-4da7-bc8a-03d9add0de40" containerID="3fa1a4de04b0eb348595cdbe8f590c869ad91f258577aa49dee698c66b2219f8" exitCode=0 Oct 01 12:41:27 crc kubenswrapper[4727]: I1001 12:41:27.462818 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dk8rp" event={"ID":"b24a2466-b1b7-4da7-bc8a-03d9add0de40","Type":"ContainerDied","Data":"3fa1a4de04b0eb348595cdbe8f590c869ad91f258577aa49dee698c66b2219f8"} Oct 01 12:41:27 crc kubenswrapper[4727]: I1001 12:41:27.466622 4727 generic.go:334] "Generic (PLEG): container finished" podID="9d2b91c6-4b36-4774-b2ca-59e9d5757b15" containerID="e4790dfaee999fd6418a51ef61fd59b42a872e0db023b724bd69beb252367053" exitCode=0 Oct 01 12:41:27 crc kubenswrapper[4727]: I1001 12:41:27.466815 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xq5nj" event={"ID":"9d2b91c6-4b36-4774-b2ca-59e9d5757b15","Type":"ContainerDied","Data":"e4790dfaee999fd6418a51ef61fd59b42a872e0db023b724bd69beb252367053"} Oct 01 12:41:27 crc kubenswrapper[4727]: I1001 12:41:27.466857 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xq5nj" event={"ID":"9d2b91c6-4b36-4774-b2ca-59e9d5757b15","Type":"ContainerStarted","Data":"760d593e15eb22ca87500e37f7a149a2a97f205a6284b509861bd990b973de5c"} Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.186924 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s4cld"] Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.189759 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.191540 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.210499 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s4cld"] Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.263970 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4c2104-54e9-42be-8a78-b4674c3e7b7c-utilities\") pod \"certified-operators-s4cld\" (UID: \"cd4c2104-54e9-42be-8a78-b4674c3e7b7c\") " pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.264205 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4c2104-54e9-42be-8a78-b4674c3e7b7c-catalog-content\") pod \"certified-operators-s4cld\" (UID: \"cd4c2104-54e9-42be-8a78-b4674c3e7b7c\") " pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.264546 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss5hb\" (UniqueName: \"kubernetes.io/projected/cd4c2104-54e9-42be-8a78-b4674c3e7b7c-kube-api-access-ss5hb\") pod \"certified-operators-s4cld\" (UID: \"cd4c2104-54e9-42be-8a78-b4674c3e7b7c\") " pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.365872 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss5hb\" (UniqueName: \"kubernetes.io/projected/cd4c2104-54e9-42be-8a78-b4674c3e7b7c-kube-api-access-ss5hb\") pod \"certified-operators-s4cld\" (UID: \"cd4c2104-54e9-42be-8a78-b4674c3e7b7c\") " pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.365945 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4c2104-54e9-42be-8a78-b4674c3e7b7c-utilities\") pod \"certified-operators-s4cld\" (UID: \"cd4c2104-54e9-42be-8a78-b4674c3e7b7c\") " pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.365970 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4c2104-54e9-42be-8a78-b4674c3e7b7c-catalog-content\") pod \"certified-operators-s4cld\" (UID: \"cd4c2104-54e9-42be-8a78-b4674c3e7b7c\") " pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.366406 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd4c2104-54e9-42be-8a78-b4674c3e7b7c-catalog-content\") pod \"certified-operators-s4cld\" (UID: \"cd4c2104-54e9-42be-8a78-b4674c3e7b7c\") " pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.366953 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd4c2104-54e9-42be-8a78-b4674c3e7b7c-utilities\") pod \"certified-operators-s4cld\" (UID: 
\"cd4c2104-54e9-42be-8a78-b4674c3e7b7c\") " pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.389741 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bcnlj"] Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.390912 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.393232 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.397870 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss5hb\" (UniqueName: \"kubernetes.io/projected/cd4c2104-54e9-42be-8a78-b4674c3e7b7c-kube-api-access-ss5hb\") pod \"certified-operators-s4cld\" (UID: \"cd4c2104-54e9-42be-8a78-b4674c3e7b7c\") " pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.399677 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bcnlj"] Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.466654 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-catalog-content\") pod \"redhat-operators-bcnlj\" (UID: \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\") " pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.466719 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxw5c\" (UniqueName: \"kubernetes.io/projected/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-kube-api-access-zxw5c\") pod \"redhat-operators-bcnlj\" (UID: \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\") " pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.466752 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-utilities\") pod \"redhat-operators-bcnlj\" (UID: \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\") " pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.474344 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dk8rp" event={"ID":"b24a2466-b1b7-4da7-bc8a-03d9add0de40","Type":"ContainerStarted","Data":"4be20e3a9aef7e1a3a69f75372bbc4b1c7ebc444df9b649b7bf1b4d6badac1ab"} Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.519622 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.567970 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxw5c\" (UniqueName: \"kubernetes.io/projected/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-kube-api-access-zxw5c\") pod \"redhat-operators-bcnlj\" (UID: \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\") " pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.568043 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-utilities\") pod \"redhat-operators-bcnlj\" (UID: \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\") " pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.568139 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-catalog-content\") pod \"redhat-operators-bcnlj\" (UID: \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\") " pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.568646 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-catalog-content\") pod \"redhat-operators-bcnlj\" (UID: \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\") " pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.568918 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-utilities\") pod \"redhat-operators-bcnlj\" (UID: \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\") " pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.586146 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxw5c\" (UniqueName: \"kubernetes.io/projected/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-kube-api-access-zxw5c\") pod \"redhat-operators-bcnlj\" (UID: \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\") " pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.700035 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s4cld"] Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.734091 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:28 crc kubenswrapper[4727]: I1001 12:41:28.930432 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bcnlj"] Oct 01 12:41:28 crc kubenswrapper[4727]: W1001 12:41:28.937378 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddac96fe7_54ef_42de_93da_4c8ea9b2f1df.slice/crio-e29f8b4be5526269e28144991cbf14d6fbb1884eaf9230f9b70558b0b1dff38e WatchSource:0}: Error finding container e29f8b4be5526269e28144991cbf14d6fbb1884eaf9230f9b70558b0b1dff38e: Status 404 returned error can't find the container with id e29f8b4be5526269e28144991cbf14d6fbb1884eaf9230f9b70558b0b1dff38e Oct 01 12:41:29 crc kubenswrapper[4727]: I1001 12:41:29.487063 4727 generic.go:334] "Generic (PLEG): container finished" podID="b24a2466-b1b7-4da7-bc8a-03d9add0de40" containerID="4be20e3a9aef7e1a3a69f75372bbc4b1c7ebc444df9b649b7bf1b4d6badac1ab" exitCode=0 Oct 01 12:41:29 crc kubenswrapper[4727]: I1001 12:41:29.487214 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dk8rp" event={"ID":"b24a2466-b1b7-4da7-bc8a-03d9add0de40","Type":"ContainerDied","Data":"4be20e3a9aef7e1a3a69f75372bbc4b1c7ebc444df9b649b7bf1b4d6badac1ab"} Oct 01 12:41:29 crc kubenswrapper[4727]: I1001 12:41:29.492986 4727 generic.go:334] "Generic (PLEG): container finished" podID="cd4c2104-54e9-42be-8a78-b4674c3e7b7c" containerID="a0892341d9ba3566f0788a868077aa5879d75be97c144238bb3bdc0cca7ac88c" exitCode=0 Oct 01 12:41:29 crc kubenswrapper[4727]: I1001 12:41:29.493102 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4cld" event={"ID":"cd4c2104-54e9-42be-8a78-b4674c3e7b7c","Type":"ContainerDied","Data":"a0892341d9ba3566f0788a868077aa5879d75be97c144238bb3bdc0cca7ac88c"} Oct 01 12:41:29 crc kubenswrapper[4727]: I1001 12:41:29.493195 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4cld" event={"ID":"cd4c2104-54e9-42be-8a78-b4674c3e7b7c","Type":"ContainerStarted","Data":"8ea56c41a60546fa55b4a04dc35b1889534d1cdf75282befc7328bcd05bcec97"} Oct 01 12:41:29 crc kubenswrapper[4727]: I1001 12:41:29.496754 4727 generic.go:334] "Generic (PLEG): container finished" podID="9d2b91c6-4b36-4774-b2ca-59e9d5757b15" containerID="cecf61afc85e9b0133a379fd43c172e844cd2d33ac9eef6b3d7da61e145c8082" exitCode=0 Oct 01 12:41:29 crc kubenswrapper[4727]: I1001 12:41:29.496832 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xq5nj" event={"ID":"9d2b91c6-4b36-4774-b2ca-59e9d5757b15","Type":"ContainerDied","Data":"cecf61afc85e9b0133a379fd43c172e844cd2d33ac9eef6b3d7da61e145c8082"} Oct 01 12:41:29 crc kubenswrapper[4727]: I1001 12:41:29.501435 4727 generic.go:334] "Generic (PLEG): container finished" podID="dac96fe7-54ef-42de-93da-4c8ea9b2f1df" containerID="3e1915404c2d754c82cdfedae65f9c48cadb886978e078dc27f770c02a0922d5" exitCode=0 Oct 01 12:41:29 crc kubenswrapper[4727]: I1001 12:41:29.501469 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcnlj" event={"ID":"dac96fe7-54ef-42de-93da-4c8ea9b2f1df","Type":"ContainerDied","Data":"3e1915404c2d754c82cdfedae65f9c48cadb886978e078dc27f770c02a0922d5"} Oct 01 12:41:29 crc kubenswrapper[4727]: I1001 12:41:29.501489 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-bcnlj" event={"ID":"dac96fe7-54ef-42de-93da-4c8ea9b2f1df","Type":"ContainerStarted","Data":"e29f8b4be5526269e28144991cbf14d6fbb1884eaf9230f9b70558b0b1dff38e"} Oct 01 12:41:30 crc kubenswrapper[4727]: I1001 12:41:30.512401 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4cld" event={"ID":"cd4c2104-54e9-42be-8a78-b4674c3e7b7c","Type":"ContainerStarted","Data":"ef6bb3ac2f4b96e22cbc919af3513d6616146d38d7d830e6fafd5de50cf66ca9"} Oct 01 12:41:30 crc kubenswrapper[4727]: I1001 12:41:30.516543 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xq5nj" event={"ID":"9d2b91c6-4b36-4774-b2ca-59e9d5757b15","Type":"ContainerStarted","Data":"6189181f64c7f6105ede82dbbe63ecea4bb2e456cd175bcf149f34ff28e6e056"} Oct 01 12:41:30 crc kubenswrapper[4727]: I1001 12:41:30.518666 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dk8rp" event={"ID":"b24a2466-b1b7-4da7-bc8a-03d9add0de40","Type":"ContainerStarted","Data":"6f4c96594dce8e8240690b82d546b0fcb5e04fce05f8512bc2fdc4e46c6a275b"} Oct 01 12:41:30 crc kubenswrapper[4727]: I1001 12:41:30.570097 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xq5nj" podStartSLOduration=2.750391593 podStartE2EDuration="5.570076157s" podCreationTimestamp="2025-10-01 12:41:25 +0000 UTC" firstStartedPulling="2025-10-01 12:41:27.47553099 +0000 UTC m=+265.796885827" lastFinishedPulling="2025-10-01 12:41:30.295215554 +0000 UTC m=+268.616570391" observedRunningTime="2025-10-01 12:41:30.569767177 +0000 UTC m=+268.891122024" watchObservedRunningTime="2025-10-01 12:41:30.570076157 +0000 UTC m=+268.891430994" Oct 01 12:41:30 crc kubenswrapper[4727]: I1001 12:41:30.592897 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dk8rp" podStartSLOduration=3.0736448100000002 podStartE2EDuration="5.592880067s" podCreationTimestamp="2025-10-01 12:41:25 +0000 UTC" firstStartedPulling="2025-10-01 12:41:27.466131557 +0000 UTC m=+265.787486394" lastFinishedPulling="2025-10-01 12:41:29.985366824 +0000 UTC m=+268.306721651" observedRunningTime="2025-10-01 12:41:30.59027831 +0000 UTC m=+268.911633157" watchObservedRunningTime="2025-10-01 12:41:30.592880067 +0000 UTC m=+268.914234904" Oct 01 12:41:31 crc kubenswrapper[4727]: I1001 12:41:31.525093 4727 generic.go:334] "Generic (PLEG): container finished" podID="cd4c2104-54e9-42be-8a78-b4674c3e7b7c" containerID="ef6bb3ac2f4b96e22cbc919af3513d6616146d38d7d830e6fafd5de50cf66ca9" exitCode=0 Oct 01 12:41:31 crc kubenswrapper[4727]: I1001 12:41:31.525856 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4cld" event={"ID":"cd4c2104-54e9-42be-8a78-b4674c3e7b7c","Type":"ContainerDied","Data":"ef6bb3ac2f4b96e22cbc919af3513d6616146d38d7d830e6fafd5de50cf66ca9"} Oct 01 12:41:31 crc kubenswrapper[4727]: I1001 12:41:31.532364 4727 generic.go:334] "Generic (PLEG): container finished" podID="dac96fe7-54ef-42de-93da-4c8ea9b2f1df" containerID="e64326dd3c07e03c442747a7475ea92b8cc8cb05ea849b4ccddac47839c421dc" exitCode=0 Oct 01 12:41:31 crc kubenswrapper[4727]: I1001 12:41:31.532639 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcnlj" 
event={"ID":"dac96fe7-54ef-42de-93da-4c8ea9b2f1df","Type":"ContainerDied","Data":"e64326dd3c07e03c442747a7475ea92b8cc8cb05ea849b4ccddac47839c421dc"} Oct 01 12:41:33 crc kubenswrapper[4727]: I1001 12:41:33.548727 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4cld" event={"ID":"cd4c2104-54e9-42be-8a78-b4674c3e7b7c","Type":"ContainerStarted","Data":"dfc464793bcf83e1871fb504703d501d84f985b9bbced744020309514952fe4b"} Oct 01 12:41:33 crc kubenswrapper[4727]: I1001 12:41:33.552338 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcnlj" event={"ID":"dac96fe7-54ef-42de-93da-4c8ea9b2f1df","Type":"ContainerStarted","Data":"00608454bf12ef580dc74d1a697fc1ce159f9ffc2c1f6b80c385d3fdc8b07c07"} Oct 01 12:41:33 crc kubenswrapper[4727]: I1001 12:41:33.567483 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s4cld" podStartSLOduration=2.978440282 podStartE2EDuration="5.567464245s" podCreationTimestamp="2025-10-01 12:41:28 +0000 UTC" firstStartedPulling="2025-10-01 12:41:29.495058428 +0000 UTC m=+267.816413265" lastFinishedPulling="2025-10-01 12:41:32.084082391 +0000 UTC m=+270.405437228" observedRunningTime="2025-10-01 12:41:33.564724894 +0000 UTC m=+271.886079741" watchObservedRunningTime="2025-10-01 12:41:33.567464245 +0000 UTC m=+271.888819102" Oct 01 12:41:33 crc kubenswrapper[4727]: I1001 12:41:33.583028 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bcnlj" podStartSLOduration=2.910436514 podStartE2EDuration="5.582991373s" podCreationTimestamp="2025-10-01 12:41:28 +0000 UTC" firstStartedPulling="2025-10-01 12:41:29.502581308 +0000 UTC m=+267.823936145" lastFinishedPulling="2025-10-01 12:41:32.175136167 +0000 UTC m=+270.496491004" observedRunningTime="2025-10-01 12:41:33.579689273 +0000 UTC m=+271.901044140" watchObservedRunningTime="2025-10-01 12:41:33.582991373 +0000 UTC m=+271.904346220" Oct 01 12:41:36 crc kubenswrapper[4727]: I1001 12:41:36.142359 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:36 crc kubenswrapper[4727]: I1001 12:41:36.143557 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:36 crc kubenswrapper[4727]: I1001 12:41:36.208580 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:36 crc kubenswrapper[4727]: I1001 12:41:36.308803 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:36 crc kubenswrapper[4727]: I1001 12:41:36.309831 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:36 crc kubenswrapper[4727]: I1001 12:41:36.347115 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:36 crc kubenswrapper[4727]: I1001 12:41:36.606719 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dk8rp" Oct 01 12:41:36 crc kubenswrapper[4727]: I1001 12:41:36.609788 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-xq5nj" Oct 01 12:41:38 crc kubenswrapper[4727]: I1001 12:41:38.520669 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:38 crc kubenswrapper[4727]: I1001 12:41:38.521022 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:38 crc kubenswrapper[4727]: I1001 12:41:38.589406 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:38 crc kubenswrapper[4727]: I1001 12:41:38.633769 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s4cld" Oct 01 12:41:38 crc kubenswrapper[4727]: I1001 12:41:38.734733 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:38 crc kubenswrapper[4727]: I1001 12:41:38.734793 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:38 crc kubenswrapper[4727]: I1001 12:41:38.777930 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:41:39 crc kubenswrapper[4727]: I1001 12:41:39.651884 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 12:42:33 crc kubenswrapper[4727]: I1001 12:42:33.291813 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:42:33 crc kubenswrapper[4727]: I1001 12:42:33.292477 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:43:03 crc kubenswrapper[4727]: I1001 12:43:03.292670 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:43:03 crc kubenswrapper[4727]: I1001 12:43:03.293873 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:43:33 crc kubenswrapper[4727]: I1001 12:43:33.299362 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:43:33 crc kubenswrapper[4727]: I1001 12:43:33.299995 4727 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:43:33 crc kubenswrapper[4727]: I1001 12:43:33.300442 4727 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:43:33 crc kubenswrapper[4727]: I1001 12:43:33.301818 4727 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff528fc413a67120cbfce88f98833b8fdf8ba19775f84a05229bef0f923e8a19"} pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:43:33 crc kubenswrapper[4727]: I1001 12:43:33.301890 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" containerID="cri-o://ff528fc413a67120cbfce88f98833b8fdf8ba19775f84a05229bef0f923e8a19" gracePeriod=600 Oct 01 12:43:34 crc kubenswrapper[4727]: I1001 12:43:34.317408 4727 generic.go:334] "Generic (PLEG): container finished" podID="d18290ae-64a5-44a5-a704-90977d85852b" containerID="ff528fc413a67120cbfce88f98833b8fdf8ba19775f84a05229bef0f923e8a19" exitCode=0 Oct 01 12:43:34 crc kubenswrapper[4727]: I1001 12:43:34.317507 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerDied","Data":"ff528fc413a67120cbfce88f98833b8fdf8ba19775f84a05229bef0f923e8a19"} Oct 01 12:43:34 crc kubenswrapper[4727]: I1001 12:43:34.317870 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"12b4ca882a4878e5a3279395bccb6e21a4ad58217420588536e4e56fcd67eeb7"} Oct 01 12:43:34 crc kubenswrapper[4727]: I1001 12:43:34.317912 4727 scope.go:117] "RemoveContainer" containerID="d2ac0138b2b2077af1e2a68fda588e8d59f457561930e3003256cb9c91e4bdca" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.683197 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xl7mt"] Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.684541 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.714288 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xl7mt"] Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.819883 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-registry-certificates\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.819943 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-registry-tls\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.819976 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-bound-sa-token\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.820091 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.820167 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.820207 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jhqw\" (UniqueName: \"kubernetes.io/projected/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-kube-api-access-5jhqw\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.820266 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.820365 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-trusted-ca\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.843580 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.921989 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-registry-certificates\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.922061 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-registry-tls\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.922112 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-bound-sa-token\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.922142 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.922169 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jhqw\" (UniqueName: \"kubernetes.io/projected/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-kube-api-access-5jhqw\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.922192 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.922236 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-trusted-ca\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.922860 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.923244 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-registry-certificates\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.923563 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-trusted-ca\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.927982 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-registry-tls\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.929044 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.942399 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jhqw\" (UniqueName: \"kubernetes.io/projected/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-kube-api-access-5jhqw\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:15 crc kubenswrapper[4727]: I1001 12:44:15.943696 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67f77fbb-d575-4c6d-8e82-0129cb3fb9c8-bound-sa-token\") pod \"image-registry-66df7c8f76-xl7mt\" (UID: \"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8\") " pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:16 crc kubenswrapper[4727]: I1001 12:44:16.006094 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:16 crc kubenswrapper[4727]: I1001 12:44:16.275987 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xl7mt"] Oct 01 12:44:16 crc kubenswrapper[4727]: I1001 12:44:16.605239 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" event={"ID":"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8","Type":"ContainerStarted","Data":"9d8fe7468fdcf974c00b577afbb7bc5a99121a552118cad689476953bc47cff7"} Oct 01 12:44:16 crc kubenswrapper[4727]: I1001 12:44:16.605292 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" event={"ID":"67f77fbb-d575-4c6d-8e82-0129cb3fb9c8","Type":"ContainerStarted","Data":"4ee06b745280e737d7ea05c3c287b967835e5ee0086a887a2c0eb41f9aa80670"} Oct 01 12:44:16 crc kubenswrapper[4727]: I1001 12:44:16.605915 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:16 crc kubenswrapper[4727]: I1001 12:44:16.630215 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" podStartSLOduration=1.630193973 podStartE2EDuration="1.630193973s" podCreationTimestamp="2025-10-01 12:44:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:44:16.628620562 +0000 UTC m=+434.949975409" watchObservedRunningTime="2025-10-01 12:44:16.630193973 +0000 UTC m=+434.951548810" Oct 01 12:44:36 crc kubenswrapper[4727]: I1001 12:44:36.012292 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xl7mt" Oct 01 12:44:36 crc kubenswrapper[4727]: I1001 12:44:36.086450 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8lp9x"] Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.154552 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb"] Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.159258 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.159913 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb"] Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.162389 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.163307 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.279337 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27f42620-d40e-4d97-a653-e9e0dab1a53c-config-volume\") pod \"collect-profiles-29322045-kvjfb\" (UID: \"27f42620-d40e-4d97-a653-e9e0dab1a53c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.279586 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96phz\" (UniqueName: \"kubernetes.io/projected/27f42620-d40e-4d97-a653-e9e0dab1a53c-kube-api-access-96phz\") pod \"collect-profiles-29322045-kvjfb\" (UID: \"27f42620-d40e-4d97-a653-e9e0dab1a53c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.279663 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27f42620-d40e-4d97-a653-e9e0dab1a53c-secret-volume\") pod \"collect-profiles-29322045-kvjfb\" (UID: \"27f42620-d40e-4d97-a653-e9e0dab1a53c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.380412 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96phz\" (UniqueName: \"kubernetes.io/projected/27f42620-d40e-4d97-a653-e9e0dab1a53c-kube-api-access-96phz\") pod \"collect-profiles-29322045-kvjfb\" (UID: \"27f42620-d40e-4d97-a653-e9e0dab1a53c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.380470 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27f42620-d40e-4d97-a653-e9e0dab1a53c-secret-volume\") pod \"collect-profiles-29322045-kvjfb\" (UID: \"27f42620-d40e-4d97-a653-e9e0dab1a53c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.380526 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27f42620-d40e-4d97-a653-e9e0dab1a53c-config-volume\") pod \"collect-profiles-29322045-kvjfb\" (UID: \"27f42620-d40e-4d97-a653-e9e0dab1a53c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.381552 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27f42620-d40e-4d97-a653-e9e0dab1a53c-config-volume\") pod 
\"collect-profiles-29322045-kvjfb\" (UID: \"27f42620-d40e-4d97-a653-e9e0dab1a53c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.387759 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27f42620-d40e-4d97-a653-e9e0dab1a53c-secret-volume\") pod \"collect-profiles-29322045-kvjfb\" (UID: \"27f42620-d40e-4d97-a653-e9e0dab1a53c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.403869 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96phz\" (UniqueName: \"kubernetes.io/projected/27f42620-d40e-4d97-a653-e9e0dab1a53c-kube-api-access-96phz\") pod \"collect-profiles-29322045-kvjfb\" (UID: \"27f42620-d40e-4d97-a653-e9e0dab1a53c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.483630 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.681878 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb"] Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.883917 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" event={"ID":"27f42620-d40e-4d97-a653-e9e0dab1a53c","Type":"ContainerStarted","Data":"1fe0c1af7a161a0cd62c4804d811c3d873da79bdd64acee98fcc0e26024e5d35"} Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.883976 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" event={"ID":"27f42620-d40e-4d97-a653-e9e0dab1a53c","Type":"ContainerStarted","Data":"474818afdadb13f783643d854dbeef57ca06d7c207c00a71ea02966954616ee1"} Oct 01 12:45:00 crc kubenswrapper[4727]: I1001 12:45:00.903870 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" podStartSLOduration=0.903853529 podStartE2EDuration="903.853529ms" podCreationTimestamp="2025-10-01 12:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:45:00.901502975 +0000 UTC m=+479.222857812" watchObservedRunningTime="2025-10-01 12:45:00.903853529 +0000 UTC m=+479.225208366" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.143593 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" podUID="73638d71-c9ed-4ad0-866d-67c36b52de3e" containerName="registry" containerID="cri-o://c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c" gracePeriod=30 Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.540660 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.596163 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-bound-sa-token\") pod \"73638d71-c9ed-4ad0-866d-67c36b52de3e\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.606601 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "73638d71-c9ed-4ad0-866d-67c36b52de3e" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.697303 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73638d71-c9ed-4ad0-866d-67c36b52de3e-registry-certificates\") pod \"73638d71-c9ed-4ad0-866d-67c36b52de3e\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.697375 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-registry-tls\") pod \"73638d71-c9ed-4ad0-866d-67c36b52de3e\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.697458 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73638d71-c9ed-4ad0-866d-67c36b52de3e-ca-trust-extracted\") pod \"73638d71-c9ed-4ad0-866d-67c36b52de3e\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.697504 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73638d71-c9ed-4ad0-866d-67c36b52de3e-trusted-ca\") pod \"73638d71-c9ed-4ad0-866d-67c36b52de3e\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.697763 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"73638d71-c9ed-4ad0-866d-67c36b52de3e\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.697835 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6x4g\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-kube-api-access-b6x4g\") pod \"73638d71-c9ed-4ad0-866d-67c36b52de3e\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.697875 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73638d71-c9ed-4ad0-866d-67c36b52de3e-installation-pull-secrets\") pod \"73638d71-c9ed-4ad0-866d-67c36b52de3e\" (UID: \"73638d71-c9ed-4ad0-866d-67c36b52de3e\") " Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.698181 4727 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.698771 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73638d71-c9ed-4ad0-866d-67c36b52de3e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "73638d71-c9ed-4ad0-866d-67c36b52de3e" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.699160 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73638d71-c9ed-4ad0-866d-67c36b52de3e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "73638d71-c9ed-4ad0-866d-67c36b52de3e" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.702715 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "73638d71-c9ed-4ad0-866d-67c36b52de3e" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.702784 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73638d71-c9ed-4ad0-866d-67c36b52de3e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "73638d71-c9ed-4ad0-866d-67c36b52de3e" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.704375 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-kube-api-access-b6x4g" (OuterVolumeSpecName: "kube-api-access-b6x4g") pod "73638d71-c9ed-4ad0-866d-67c36b52de3e" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e"). InnerVolumeSpecName "kube-api-access-b6x4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.712831 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "73638d71-c9ed-4ad0-866d-67c36b52de3e" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.717046 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73638d71-c9ed-4ad0-866d-67c36b52de3e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "73638d71-c9ed-4ad0-866d-67c36b52de3e" (UID: "73638d71-c9ed-4ad0-866d-67c36b52de3e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.800306 4727 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73638d71-c9ed-4ad0-866d-67c36b52de3e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.800368 4727 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73638d71-c9ed-4ad0-866d-67c36b52de3e-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.800383 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6x4g\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-kube-api-access-b6x4g\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.800403 4727 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73638d71-c9ed-4ad0-866d-67c36b52de3e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.800418 4727 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73638d71-c9ed-4ad0-866d-67c36b52de3e-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.800431 4727 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73638d71-c9ed-4ad0-866d-67c36b52de3e-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.892152 4727 generic.go:334] "Generic (PLEG): container finished" podID="73638d71-c9ed-4ad0-866d-67c36b52de3e" containerID="c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c" exitCode=0 Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.892238 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.892251 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" event={"ID":"73638d71-c9ed-4ad0-866d-67c36b52de3e","Type":"ContainerDied","Data":"c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c"} Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.892364 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8lp9x" event={"ID":"73638d71-c9ed-4ad0-866d-67c36b52de3e","Type":"ContainerDied","Data":"136dc78359478b3de167a0d296a7cc0d7f1725f7ebe4995b9e462639813d9f84"} Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.892386 4727 scope.go:117] "RemoveContainer" containerID="c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.895380 4727 generic.go:334] "Generic (PLEG): container finished" podID="27f42620-d40e-4d97-a653-e9e0dab1a53c" containerID="1fe0c1af7a161a0cd62c4804d811c3d873da79bdd64acee98fcc0e26024e5d35" exitCode=0 Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.895436 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" event={"ID":"27f42620-d40e-4d97-a653-e9e0dab1a53c","Type":"ContainerDied","Data":"1fe0c1af7a161a0cd62c4804d811c3d873da79bdd64acee98fcc0e26024e5d35"} Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.922375 4727 scope.go:117] "RemoveContainer" containerID="c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c" Oct 01 12:45:01 crc kubenswrapper[4727]: E1001 12:45:01.923011 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c\": container with ID starting with c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c not found: ID does not exist" containerID="c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.923058 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c"} err="failed to get container status \"c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c\": rpc error: code = NotFound desc = could not find container \"c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c\": container with ID starting with c08d6a910be048b36cf711a44b6fa49d645e0b3da2edbb22eb72a411b3abc79c not found: ID does not exist" Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.937724 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8lp9x"] Oct 01 12:45:01 crc kubenswrapper[4727]: I1001 12:45:01.944246 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8lp9x"] Oct 01 12:45:02 crc kubenswrapper[4727]: I1001 12:45:02.381280 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73638d71-c9ed-4ad0-866d-67c36b52de3e" path="/var/lib/kubelet/pods/73638d71-c9ed-4ad0-866d-67c36b52de3e/volumes" Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.150177 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.319674 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27f42620-d40e-4d97-a653-e9e0dab1a53c-secret-volume\") pod \"27f42620-d40e-4d97-a653-e9e0dab1a53c\" (UID: \"27f42620-d40e-4d97-a653-e9e0dab1a53c\") " Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.319724 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27f42620-d40e-4d97-a653-e9e0dab1a53c-config-volume\") pod \"27f42620-d40e-4d97-a653-e9e0dab1a53c\" (UID: \"27f42620-d40e-4d97-a653-e9e0dab1a53c\") " Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.319784 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96phz\" (UniqueName: \"kubernetes.io/projected/27f42620-d40e-4d97-a653-e9e0dab1a53c-kube-api-access-96phz\") pod \"27f42620-d40e-4d97-a653-e9e0dab1a53c\" (UID: \"27f42620-d40e-4d97-a653-e9e0dab1a53c\") " Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.320474 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27f42620-d40e-4d97-a653-e9e0dab1a53c-config-volume" (OuterVolumeSpecName: "config-volume") pod "27f42620-d40e-4d97-a653-e9e0dab1a53c" (UID: "27f42620-d40e-4d97-a653-e9e0dab1a53c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.324697 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f42620-d40e-4d97-a653-e9e0dab1a53c-kube-api-access-96phz" (OuterVolumeSpecName: "kube-api-access-96phz") pod "27f42620-d40e-4d97-a653-e9e0dab1a53c" (UID: "27f42620-d40e-4d97-a653-e9e0dab1a53c"). InnerVolumeSpecName "kube-api-access-96phz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.325164 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f42620-d40e-4d97-a653-e9e0dab1a53c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "27f42620-d40e-4d97-a653-e9e0dab1a53c" (UID: "27f42620-d40e-4d97-a653-e9e0dab1a53c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.421731 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96phz\" (UniqueName: \"kubernetes.io/projected/27f42620-d40e-4d97-a653-e9e0dab1a53c-kube-api-access-96phz\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.422071 4727 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27f42620-d40e-4d97-a653-e9e0dab1a53c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.422185 4727 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27f42620-d40e-4d97-a653-e9e0dab1a53c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.908837 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" event={"ID":"27f42620-d40e-4d97-a653-e9e0dab1a53c","Type":"ContainerDied","Data":"474818afdadb13f783643d854dbeef57ca06d7c207c00a71ea02966954616ee1"} Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.908890 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="474818afdadb13f783643d854dbeef57ca06d7c207c00a71ea02966954616ee1" Oct 01 12:45:03 crc kubenswrapper[4727]: I1001 12:45:03.909214 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322045-kvjfb" Oct 01 12:45:33 crc kubenswrapper[4727]: I1001 12:45:33.292204 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:45:33 crc kubenswrapper[4727]: I1001 12:45:33.293272 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:46:03 crc kubenswrapper[4727]: I1001 12:46:03.291378 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:46:03 crc kubenswrapper[4727]: I1001 12:46:03.292127 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.180375 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xh4gb"] Oct 01 12:46:25 crc kubenswrapper[4727]: E1001 12:46:25.181390 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73638d71-c9ed-4ad0-866d-67c36b52de3e" containerName="registry" Oct 01 12:46:25 crc kubenswrapper[4727]: 
I1001 12:46:25.181407 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="73638d71-c9ed-4ad0-866d-67c36b52de3e" containerName="registry" Oct 01 12:46:25 crc kubenswrapper[4727]: E1001 12:46:25.181425 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f42620-d40e-4d97-a653-e9e0dab1a53c" containerName="collect-profiles" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.181432 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f42620-d40e-4d97-a653-e9e0dab1a53c" containerName="collect-profiles" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.181572 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="73638d71-c9ed-4ad0-866d-67c36b52de3e" containerName="registry" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.181586 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f42620-d40e-4d97-a653-e9e0dab1a53c" containerName="collect-profiles" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.182088 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-xh4gb" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.188594 4727 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wpb9g" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.188594 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.188787 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.193438 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-28n2t"] Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.194319 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-28n2t" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.197331 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xh4gb"] Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.201685 4727 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-gjkg9" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.209473 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-28n2t"] Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.227359 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sbfpp"] Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.228020 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-sbfpp" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.229768 4727 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-bgjmw" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.238174 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sbfpp"] Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.281774 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwlpl\" (UniqueName: \"kubernetes.io/projected/81c18dc6-8f66-4bc2-9b57-867976fab5d8-kube-api-access-xwlpl\") pod \"cert-manager-5b446d88c5-28n2t\" (UID: \"81c18dc6-8f66-4bc2-9b57-867976fab5d8\") " pod="cert-manager/cert-manager-5b446d88c5-28n2t" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.281844 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pj65\" (UniqueName: \"kubernetes.io/projected/c61e6a7c-4c30-46aa-a082-53ac21575230-kube-api-access-9pj65\") pod \"cert-manager-cainjector-7f985d654d-xh4gb\" (UID: \"c61e6a7c-4c30-46aa-a082-53ac21575230\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xh4gb" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.281902 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md95q\" (UniqueName: \"kubernetes.io/projected/5a9acff3-95f2-4c83-84a8-abe0aa97789c-kube-api-access-md95q\") pod \"cert-manager-webhook-5655c58dd6-sbfpp\" (UID: \"5a9acff3-95f2-4c83-84a8-abe0aa97789c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sbfpp" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.382738 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pj65\" (UniqueName: \"kubernetes.io/projected/c61e6a7c-4c30-46aa-a082-53ac21575230-kube-api-access-9pj65\") pod \"cert-manager-cainjector-7f985d654d-xh4gb\" (UID: \"c61e6a7c-4c30-46aa-a082-53ac21575230\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xh4gb" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.382821 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md95q\" (UniqueName: \"kubernetes.io/projected/5a9acff3-95f2-4c83-84a8-abe0aa97789c-kube-api-access-md95q\") pod \"cert-manager-webhook-5655c58dd6-sbfpp\" (UID: \"5a9acff3-95f2-4c83-84a8-abe0aa97789c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sbfpp" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.382882 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwlpl\" (UniqueName: \"kubernetes.io/projected/81c18dc6-8f66-4bc2-9b57-867976fab5d8-kube-api-access-xwlpl\") pod \"cert-manager-5b446d88c5-28n2t\" (UID: \"81c18dc6-8f66-4bc2-9b57-867976fab5d8\") " pod="cert-manager/cert-manager-5b446d88c5-28n2t" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.405654 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pj65\" (UniqueName: \"kubernetes.io/projected/c61e6a7c-4c30-46aa-a082-53ac21575230-kube-api-access-9pj65\") pod \"cert-manager-cainjector-7f985d654d-xh4gb\" (UID: \"c61e6a7c-4c30-46aa-a082-53ac21575230\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xh4gb" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.410165 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-md95q\" (UniqueName: \"kubernetes.io/projected/5a9acff3-95f2-4c83-84a8-abe0aa97789c-kube-api-access-md95q\") pod \"cert-manager-webhook-5655c58dd6-sbfpp\" (UID: \"5a9acff3-95f2-4c83-84a8-abe0aa97789c\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-sbfpp" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.411153 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwlpl\" (UniqueName: \"kubernetes.io/projected/81c18dc6-8f66-4bc2-9b57-867976fab5d8-kube-api-access-xwlpl\") pod \"cert-manager-5b446d88c5-28n2t\" (UID: \"81c18dc6-8f66-4bc2-9b57-867976fab5d8\") " pod="cert-manager/cert-manager-5b446d88c5-28n2t" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.507074 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-xh4gb" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.516535 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-28n2t" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.544062 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-sbfpp" Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.741280 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xh4gb"] Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.753369 4727 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.981179 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-sbfpp"] Oct 01 12:46:25 crc kubenswrapper[4727]: I1001 12:46:25.998168 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-28n2t"] Oct 01 12:46:26 crc kubenswrapper[4727]: W1001 12:46:26.003665 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81c18dc6_8f66_4bc2_9b57_867976fab5d8.slice/crio-f8959792fd5c0391f5e100aa04b78bc395ae6faf3114540bc51c7aa2bc0c8215 WatchSource:0}: Error finding container f8959792fd5c0391f5e100aa04b78bc395ae6faf3114540bc51c7aa2bc0c8215: Status 404 returned error can't find the container with id f8959792fd5c0391f5e100aa04b78bc395ae6faf3114540bc51c7aa2bc0c8215 Oct 01 12:46:26 crc kubenswrapper[4727]: I1001 12:46:26.400301 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-28n2t" event={"ID":"81c18dc6-8f66-4bc2-9b57-867976fab5d8","Type":"ContainerStarted","Data":"f8959792fd5c0391f5e100aa04b78bc395ae6faf3114540bc51c7aa2bc0c8215"} Oct 01 12:46:26 crc kubenswrapper[4727]: I1001 12:46:26.401210 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-sbfpp" event={"ID":"5a9acff3-95f2-4c83-84a8-abe0aa97789c","Type":"ContainerStarted","Data":"45c894c620f9df72971b1d0bc74bdb24fa743cfde35d62736cc48998a34fbae7"} Oct 01 12:46:26 crc kubenswrapper[4727]: I1001 12:46:26.402108 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-xh4gb" event={"ID":"c61e6a7c-4c30-46aa-a082-53ac21575230","Type":"ContainerStarted","Data":"872298c6a3c46f615afd71b808aba6b7d3b676d1ab64f49d5607ee680237fdaf"} Oct 01 12:46:28 crc kubenswrapper[4727]: I1001 
12:46:28.425377 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-xh4gb" event={"ID":"c61e6a7c-4c30-46aa-a082-53ac21575230","Type":"ContainerStarted","Data":"c3fa9819c5ae9f02b3109924a789be8e1d27f586a75b79b67db9035c497489e9"} Oct 01 12:46:28 crc kubenswrapper[4727]: I1001 12:46:28.447653 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-xh4gb" podStartSLOduration=1.162355449 podStartE2EDuration="3.447624557s" podCreationTimestamp="2025-10-01 12:46:25 +0000 UTC" firstStartedPulling="2025-10-01 12:46:25.75312825 +0000 UTC m=+564.074483087" lastFinishedPulling="2025-10-01 12:46:28.038397338 +0000 UTC m=+566.359752195" observedRunningTime="2025-10-01 12:46:28.441781662 +0000 UTC m=+566.763136519" watchObservedRunningTime="2025-10-01 12:46:28.447624557 +0000 UTC m=+566.768979404" Oct 01 12:46:31 crc kubenswrapper[4727]: I1001 12:46:31.444490 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-sbfpp" event={"ID":"5a9acff3-95f2-4c83-84a8-abe0aa97789c","Type":"ContainerStarted","Data":"d038eda5603d5bc8f2bb1a188d8f0c3e95cca5faf5faf65636eefedd55ba6bdf"} Oct 01 12:46:31 crc kubenswrapper[4727]: I1001 12:46:31.445171 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-sbfpp" Oct 01 12:46:31 crc kubenswrapper[4727]: I1001 12:46:31.446731 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-28n2t" event={"ID":"81c18dc6-8f66-4bc2-9b57-867976fab5d8","Type":"ContainerStarted","Data":"8b5dc747525318d55529432925bf9607514676a1aaea36f619b22523c7da953b"} Oct 01 12:46:31 crc kubenswrapper[4727]: I1001 12:46:31.485273 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-sbfpp" podStartSLOduration=1.724031018 podStartE2EDuration="6.485244843s" podCreationTimestamp="2025-10-01 12:46:25 +0000 UTC" firstStartedPulling="2025-10-01 12:46:25.988826436 +0000 UTC m=+564.310181273" lastFinishedPulling="2025-10-01 12:46:30.750040261 +0000 UTC m=+569.071395098" observedRunningTime="2025-10-01 12:46:31.461944113 +0000 UTC m=+569.783298950" watchObservedRunningTime="2025-10-01 12:46:31.485244843 +0000 UTC m=+569.806599680" Oct 01 12:46:31 crc kubenswrapper[4727]: I1001 12:46:31.485711 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-28n2t" podStartSLOduration=1.736237868 podStartE2EDuration="6.485703148s" podCreationTimestamp="2025-10-01 12:46:25 +0000 UTC" firstStartedPulling="2025-10-01 12:46:26.006785547 +0000 UTC m=+564.328140394" lastFinishedPulling="2025-10-01 12:46:30.756250837 +0000 UTC m=+569.077605674" observedRunningTime="2025-10-01 12:46:31.477160476 +0000 UTC m=+569.798515333" watchObservedRunningTime="2025-10-01 12:46:31.485703148 +0000 UTC m=+569.807058065" Oct 01 12:46:33 crc kubenswrapper[4727]: I1001 12:46:33.292883 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:46:33 crc kubenswrapper[4727]: I1001 12:46:33.293512 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" 
podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:46:33 crc kubenswrapper[4727]: I1001 12:46:33.293609 4727 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:46:33 crc kubenswrapper[4727]: I1001 12:46:33.294990 4727 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12b4ca882a4878e5a3279395bccb6e21a4ad58217420588536e4e56fcd67eeb7"} pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:46:33 crc kubenswrapper[4727]: I1001 12:46:33.295243 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" containerID="cri-o://12b4ca882a4878e5a3279395bccb6e21a4ad58217420588536e4e56fcd67eeb7" gracePeriod=600 Oct 01 12:46:33 crc kubenswrapper[4727]: I1001 12:46:33.464243 4727 generic.go:334] "Generic (PLEG): container finished" podID="d18290ae-64a5-44a5-a704-90977d85852b" containerID="12b4ca882a4878e5a3279395bccb6e21a4ad58217420588536e4e56fcd67eeb7" exitCode=0 Oct 01 12:46:33 crc kubenswrapper[4727]: I1001 12:46:33.464298 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerDied","Data":"12b4ca882a4878e5a3279395bccb6e21a4ad58217420588536e4e56fcd67eeb7"} Oct 01 12:46:33 crc kubenswrapper[4727]: I1001 12:46:33.464363 4727 scope.go:117] "RemoveContainer" containerID="ff528fc413a67120cbfce88f98833b8fdf8ba19775f84a05229bef0f923e8a19" Oct 01 12:46:34 crc kubenswrapper[4727]: I1001 12:46:34.475898 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"5a5c4ca99360c9b81c10e0ced10d126f629e2db295e44de92257033d1fe6295f"} Oct 01 12:46:35 crc kubenswrapper[4727]: I1001 12:46:35.548396 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-sbfpp" Oct 01 12:46:35 crc kubenswrapper[4727]: I1001 12:46:35.861077 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pwx55"] Oct 01 12:46:35 crc kubenswrapper[4727]: I1001 12:46:35.861431 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovn-controller" containerID="cri-o://3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d" gracePeriod=30 Oct 01 12:46:35 crc kubenswrapper[4727]: I1001 12:46:35.861490 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="nbdb" containerID="cri-o://3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8" gracePeriod=30 Oct 01 12:46:35 crc kubenswrapper[4727]: I1001 12:46:35.861561 4727 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="kube-rbac-proxy-node" containerID="cri-o://1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2" gracePeriod=30 Oct 01 12:46:35 crc kubenswrapper[4727]: I1001 12:46:35.861550 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60" gracePeriod=30 Oct 01 12:46:35 crc kubenswrapper[4727]: I1001 12:46:35.861611 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovn-acl-logging" containerID="cri-o://86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d" gracePeriod=30 Oct 01 12:46:35 crc kubenswrapper[4727]: I1001 12:46:35.861541 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="northd" containerID="cri-o://e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5" gracePeriod=30 Oct 01 12:46:35 crc kubenswrapper[4727]: I1001 12:46:35.861694 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="sbdb" containerID="cri-o://69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df" gracePeriod=30 Oct 01 12:46:35 crc kubenswrapper[4727]: I1001 12:46:35.904323 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" containerID="cri-o://b2e1765b2828434b1e02dfd7ac7d9dc1358e15d8a1f0f3caba9d3b234e1cd232" gracePeriod=30 Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.280449 4727 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8 is running failed: container process not found" containerID="3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.280585 4727 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df is running failed: container process not found" containerID="69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.281352 4727 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8 is running failed: container process not found" containerID="3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.281392 4727 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df is running failed: container process not found" containerID="69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.281813 4727 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8 is running failed: container process not found" containerID="3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.281876 4727 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="nbdb" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.281889 4727 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df is running failed: container process not found" containerID="69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.281943 4727 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="sbdb" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.488749 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovnkube-controller/3.log" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.490935 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovn-acl-logging/0.log" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491322 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovn-controller/0.log" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491722 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="b2e1765b2828434b1e02dfd7ac7d9dc1358e15d8a1f0f3caba9d3b234e1cd232" exitCode=0 Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491761 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df" exitCode=0 Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491772 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8" exitCode=0 Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491780 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5" exitCode=0 Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491790 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60" exitCode=0 Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491799 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2" exitCode=0 Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491807 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d" exitCode=143 Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491816 4727 generic.go:334] "Generic (PLEG): container finished" podID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerID="3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d" exitCode=143 Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491837 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" 
event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"b2e1765b2828434b1e02dfd7ac7d9dc1358e15d8a1f0f3caba9d3b234e1cd232"} Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491946 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df"} Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491968 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8"} Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.491987 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5"} Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.492026 4727 scope.go:117] "RemoveContainer" containerID="69090b800e3d9e3cac2f5bb288478653d9be161e2aa288dc851e6cdd0acd1b57" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.492324 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60"} Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.492376 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2"} Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.492399 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d"} Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.492655 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d"} Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.493634 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-slqxs_5cf1a0b8-9119-44c6-91ea-473317335fb9/kube-multus/2.log" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.494078 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-slqxs_5cf1a0b8-9119-44c6-91ea-473317335fb9/kube-multus/1.log" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.494227 4727 generic.go:334] "Generic (PLEG): container finished" podID="5cf1a0b8-9119-44c6-91ea-473317335fb9" containerID="6e30e4ca49faf2e9ac3302ec2021e148f3abddfba6a1d82a337dee7352158388" exitCode=2 Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.494266 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-slqxs" event={"ID":"5cf1a0b8-9119-44c6-91ea-473317335fb9","Type":"ContainerDied","Data":"6e30e4ca49faf2e9ac3302ec2021e148f3abddfba6a1d82a337dee7352158388"} Oct 01 12:46:36 crc 
kubenswrapper[4727]: I1001 12:46:36.494722 4727 scope.go:117] "RemoveContainer" containerID="6e30e4ca49faf2e9ac3302ec2021e148f3abddfba6a1d82a337dee7352158388" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.494937 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-slqxs_openshift-multus(5cf1a0b8-9119-44c6-91ea-473317335fb9)\"" pod="openshift-multus/multus-slqxs" podUID="5cf1a0b8-9119-44c6-91ea-473317335fb9" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.516230 4727 scope.go:117] "RemoveContainer" containerID="646bb050f901e31d33162aa5191505e91edf58a243c2dac9bf5b84e99bcebe1c" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.617045 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovn-acl-logging/0.log" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.617520 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovn-controller/0.log" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.618019 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.667677 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vc7bk"] Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.667874 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.667885 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.667892 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="kubecfg-setup" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.667898 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="kubecfg-setup" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.667906 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="nbdb" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.667913 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="nbdb" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.667923 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="northd" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.667929 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="northd" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.667937 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.667942 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc 
kubenswrapper[4727]: E1001 12:46:36.667950 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="sbdb" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.667956 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="sbdb" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.667965 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovn-acl-logging" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.667971 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovn-acl-logging" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.667976 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.667981 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.667990 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovn-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668010 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovn-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.668017 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668023 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.668030 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668036 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.668046 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="kube-rbac-proxy-node" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668052 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="kube-rbac-proxy-node" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668139 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668152 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668159 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668165 4727 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="sbdb" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668170 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668176 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovn-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668183 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="nbdb" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668190 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovn-acl-logging" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668198 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="kube-rbac-proxy-node" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668204 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="northd" Oct 01 12:46:36 crc kubenswrapper[4727]: E1001 12:46:36.668284 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668290 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668403 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.668415 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" containerName="ovnkube-controller" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.669834 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748714 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-node-log\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748749 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-cni-netd\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748771 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-ovn\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748798 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-ovnkube-script-lib\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748829 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-openvswitch\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748819 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-node-log" (OuterVolumeSpecName: "node-log") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748843 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-run-netns\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748887 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748878 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748922 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748882 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748921 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-slash\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.748951 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-slash" (OuterVolumeSpecName: "host-slash") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749026 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-log-socket\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749056 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txq6l\" (UniqueName: \"kubernetes.io/projected/a908511b-2ce2-4a11-8dad-3867bee13f57-kube-api-access-txq6l\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749077 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-cni-bin\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749098 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a908511b-2ce2-4a11-8dad-3867bee13f57-ovn-node-metrics-cert\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749111 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-kubelet\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749129 4727 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-systemd\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749155 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749178 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-run-ovn-kubernetes\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749207 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-var-lib-openvswitch\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749226 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-env-overrides\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749246 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-ovnkube-config\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749263 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749267 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-etc-openvswitch\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749290 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749310 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-log-socket" (OuterVolumeSpecName: "log-socket") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749326 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-systemd-units\") pod \"a908511b-2ce2-4a11-8dad-3867bee13f57\" (UID: \"a908511b-2ce2-4a11-8dad-3867bee13f57\") " Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749623 4727 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749634 4727 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-node-log\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749643 4727 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749652 4727 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749660 4727 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749668 4727 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749676 4727 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749683 4727 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-slash\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749691 4727 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-log-socket\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749715 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: 
"a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749738 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.749755 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.750184 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.750265 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.750226 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.750707 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.750773 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.754864 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a908511b-2ce2-4a11-8dad-3867bee13f57-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.754970 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a908511b-2ce2-4a11-8dad-3867bee13f57-kube-api-access-txq6l" (OuterVolumeSpecName: "kube-api-access-txq6l") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "kube-api-access-txq6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.772058 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a908511b-2ce2-4a11-8dad-3867bee13f57" (UID: "a908511b-2ce2-4a11-8dad-3867bee13f57"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851162 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851225 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-run-openvswitch\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851262 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-systemd-units\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851385 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-slash\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851458 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-run-systemd\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851498 4727 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b85b2021-ab82-4d7f-9b51-86ab15117063-ovn-node-metrics-cert\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851536 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b85b2021-ab82-4d7f-9b51-86ab15117063-ovnkube-script-lib\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851566 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-cni-bin\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851729 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-log-socket\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851780 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-etc-openvswitch\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851841 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mx5r\" (UniqueName: \"kubernetes.io/projected/b85b2021-ab82-4d7f-9b51-86ab15117063-kube-api-access-8mx5r\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851915 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-kubelet\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.851974 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-node-log\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852043 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-run-ovn\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" 
Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852075 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-cni-netd\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852260 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b85b2021-ab82-4d7f-9b51-86ab15117063-ovnkube-config\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852321 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-run-ovn-kubernetes\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852358 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-run-netns\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852379 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-var-lib-openvswitch\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852402 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b85b2021-ab82-4d7f-9b51-86ab15117063-env-overrides\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852527 4727 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852564 4727 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852586 4727 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852604 4727 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-var-lib-openvswitch\") 
on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852624 4727 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852641 4727 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a908511b-2ce2-4a11-8dad-3867bee13f57-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852653 4727 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852669 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txq6l\" (UniqueName: \"kubernetes.io/projected/a908511b-2ce2-4a11-8dad-3867bee13f57-kube-api-access-txq6l\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852683 4727 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852694 4727 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a908511b-2ce2-4a11-8dad-3867bee13f57-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.852706 4727 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a908511b-2ce2-4a11-8dad-3867bee13f57-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.954699 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-run-openvswitch\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.954778 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-systemd-units\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.954824 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-slash\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.954867 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-run-systemd\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.954878 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-run-openvswitch\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.954907 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b85b2021-ab82-4d7f-9b51-86ab15117063-ovn-node-metrics-cert\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.954878 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-systemd-units\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.954913 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-slash\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.954944 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-run-systemd\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955119 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b85b2021-ab82-4d7f-9b51-86ab15117063-ovnkube-script-lib\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955155 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-cni-bin\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955195 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-log-socket\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955228 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-etc-openvswitch\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955254 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-cni-bin\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955273 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-log-socket\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955267 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mx5r\" (UniqueName: \"kubernetes.io/projected/b85b2021-ab82-4d7f-9b51-86ab15117063-kube-api-access-8mx5r\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955315 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-etc-openvswitch\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955334 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-kubelet\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955377 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-node-log\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955417 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-run-ovn\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955439 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-cni-netd\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955449 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-kubelet\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955518 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-node-log\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955521 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-cni-netd\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955526 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-run-ovn\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955558 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b85b2021-ab82-4d7f-9b51-86ab15117063-ovnkube-config\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955614 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-run-ovn-kubernetes\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955652 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-run-netns\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955665 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-run-ovn-kubernetes\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955697 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-var-lib-openvswitch\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955673 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-var-lib-openvswitch\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955722 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-run-netns\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 
12:46:36.955731 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b85b2021-ab82-4d7f-9b51-86ab15117063-env-overrides\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955824 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.955916 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b85b2021-ab82-4d7f-9b51-86ab15117063-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.956135 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b85b2021-ab82-4d7f-9b51-86ab15117063-ovnkube-script-lib\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.956178 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b85b2021-ab82-4d7f-9b51-86ab15117063-ovnkube-config\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.956187 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b85b2021-ab82-4d7f-9b51-86ab15117063-env-overrides\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.958620 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b85b2021-ab82-4d7f-9b51-86ab15117063-ovn-node-metrics-cert\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.972019 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mx5r\" (UniqueName: \"kubernetes.io/projected/b85b2021-ab82-4d7f-9b51-86ab15117063-kube-api-access-8mx5r\") pod \"ovnkube-node-vc7bk\" (UID: \"b85b2021-ab82-4d7f-9b51-86ab15117063\") " pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:36 crc kubenswrapper[4727]: I1001 12:46:36.995846 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:37 crc kubenswrapper[4727]: W1001 12:46:37.025722 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb85b2021_ab82_4d7f_9b51_86ab15117063.slice/crio-da88955aafa40df223183f480c19dd9a6d23fce8392ab48b958eb72bdb72347d WatchSource:0}: Error finding container da88955aafa40df223183f480c19dd9a6d23fce8392ab48b958eb72bdb72347d: Status 404 returned error can't find the container with id da88955aafa40df223183f480c19dd9a6d23fce8392ab48b958eb72bdb72347d Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.505422 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovn-acl-logging/0.log" Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.506476 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwx55_a908511b-2ce2-4a11-8dad-3867bee13f57/ovn-controller/0.log" Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.507076 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" event={"ID":"a908511b-2ce2-4a11-8dad-3867bee13f57","Type":"ContainerDied","Data":"51cd98c693b08619591dd5b354ddc00a92e7e447846a509d65c77f8dbb77dad3"} Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.507158 4727 scope.go:117] "RemoveContainer" containerID="b2e1765b2828434b1e02dfd7ac7d9dc1358e15d8a1f0f3caba9d3b234e1cd232" Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.507166 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwx55" Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.509425 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-slqxs_5cf1a0b8-9119-44c6-91ea-473317335fb9/kube-multus/2.log" Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.511526 4727 generic.go:334] "Generic (PLEG): container finished" podID="b85b2021-ab82-4d7f-9b51-86ab15117063" containerID="43f463dfd49a4960823f7f139e60bc7e3eb95bab7c66581108a4bb89c9e30350" exitCode=0 Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.511578 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" event={"ID":"b85b2021-ab82-4d7f-9b51-86ab15117063","Type":"ContainerDied","Data":"43f463dfd49a4960823f7f139e60bc7e3eb95bab7c66581108a4bb89c9e30350"} Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.511605 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" event={"ID":"b85b2021-ab82-4d7f-9b51-86ab15117063","Type":"ContainerStarted","Data":"da88955aafa40df223183f480c19dd9a6d23fce8392ab48b958eb72bdb72347d"} Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.553986 4727 scope.go:117] "RemoveContainer" containerID="69f2d5b12ab933313b5acdeedea10bfc6db7128fdd25845334c6a9ff5755d5df" Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.561139 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pwx55"] Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.566691 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pwx55"] Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.585158 4727 scope.go:117] "RemoveContainer" 
containerID="3ceb8cf5c9a9dd1bff9b4daaafdf904b711ffd987d8610fb42c5481ae2d9aee8" Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.605049 4727 scope.go:117] "RemoveContainer" containerID="e1e931e828ce88a1bd31f19daf58e0b5120dd6fcc707ee0c794dd5f7616e9da5" Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.630964 4727 scope.go:117] "RemoveContainer" containerID="d53f4e03b88291f43a17bcdac384deae568bc6e97b7c3324765e3e68e8cdcf60" Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.650733 4727 scope.go:117] "RemoveContainer" containerID="1acf37b6ca97576284823c27d64e238278e365e765d77c74dd698933063b5de2" Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.671892 4727 scope.go:117] "RemoveContainer" containerID="86be2de807dc63bc549d068336ef7c643e28051f22ef1a90f5d3a413cc34ee0d" Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.687191 4727 scope.go:117] "RemoveContainer" containerID="3ddd80e7d051beefff138c37b82f1f6792e6d5a0dba178f1ddbc2b075282db8d" Oct 01 12:46:37 crc kubenswrapper[4727]: I1001 12:46:37.706087 4727 scope.go:117] "RemoveContainer" containerID="d003ad0d19cb5897404200a3813217d79ab060658dc812eb5479d3f3c19b10a6" Oct 01 12:46:38 crc kubenswrapper[4727]: I1001 12:46:38.381664 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a908511b-2ce2-4a11-8dad-3867bee13f57" path="/var/lib/kubelet/pods/a908511b-2ce2-4a11-8dad-3867bee13f57/volumes" Oct 01 12:46:38 crc kubenswrapper[4727]: I1001 12:46:38.519764 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" event={"ID":"b85b2021-ab82-4d7f-9b51-86ab15117063","Type":"ContainerStarted","Data":"cb106474b3546da5bf0b7c04c8b23721e4957039fb5c6c408979646c88c61665"} Oct 01 12:46:38 crc kubenswrapper[4727]: I1001 12:46:38.519852 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" event={"ID":"b85b2021-ab82-4d7f-9b51-86ab15117063","Type":"ContainerStarted","Data":"5b76400f34fff1dae4e7bb6c14bff301d419c1764b48b86c404cd60417fa7bfc"} Oct 01 12:46:38 crc kubenswrapper[4727]: I1001 12:46:38.519886 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" event={"ID":"b85b2021-ab82-4d7f-9b51-86ab15117063","Type":"ContainerStarted","Data":"eb45eccb34afdec322b4e8ef88da66d199c37437e3a521524e76224500d1feec"} Oct 01 12:46:38 crc kubenswrapper[4727]: I1001 12:46:38.519912 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" event={"ID":"b85b2021-ab82-4d7f-9b51-86ab15117063","Type":"ContainerStarted","Data":"95c8adc94674269057c20bdf46acf142f00420e1fdea973101820d47be6b2d89"} Oct 01 12:46:38 crc kubenswrapper[4727]: I1001 12:46:38.519938 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" event={"ID":"b85b2021-ab82-4d7f-9b51-86ab15117063","Type":"ContainerStarted","Data":"8e51c6d69c990c4a9bf03fde87f49ce974e8423ee330f13e8489a101aa55f3bf"} Oct 01 12:46:38 crc kubenswrapper[4727]: I1001 12:46:38.519962 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" event={"ID":"b85b2021-ab82-4d7f-9b51-86ab15117063","Type":"ContainerStarted","Data":"1ceaf48454d742acfe7574e5216bd43f1ab66395e6853d4c5bbee5f322ac1d15"} Oct 01 12:46:40 crc kubenswrapper[4727]: I1001 12:46:40.538190 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" 
event={"ID":"b85b2021-ab82-4d7f-9b51-86ab15117063","Type":"ContainerStarted","Data":"f98ac4a92331619218a79899b3765afebbb73fa451527ca16f2f8dcfc1ab36e8"} Oct 01 12:46:43 crc kubenswrapper[4727]: I1001 12:46:43.560456 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" event={"ID":"b85b2021-ab82-4d7f-9b51-86ab15117063","Type":"ContainerStarted","Data":"edda049990bb8de05a3803e03e30823a158a05dd725f2a33cf27bb67ef3e7f07"} Oct 01 12:46:43 crc kubenswrapper[4727]: I1001 12:46:43.562235 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:43 crc kubenswrapper[4727]: I1001 12:46:43.562356 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:43 crc kubenswrapper[4727]: I1001 12:46:43.592740 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" podStartSLOduration=7.592723491 podStartE2EDuration="7.592723491s" podCreationTimestamp="2025-10-01 12:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:46:43.590437748 +0000 UTC m=+581.911792595" watchObservedRunningTime="2025-10-01 12:46:43.592723491 +0000 UTC m=+581.914078328" Oct 01 12:46:43 crc kubenswrapper[4727]: I1001 12:46:43.593108 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:44 crc kubenswrapper[4727]: I1001 12:46:44.567449 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:44 crc kubenswrapper[4727]: I1001 12:46:44.595673 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:46:51 crc kubenswrapper[4727]: I1001 12:46:51.374118 4727 scope.go:117] "RemoveContainer" containerID="6e30e4ca49faf2e9ac3302ec2021e148f3abddfba6a1d82a337dee7352158388" Oct 01 12:46:51 crc kubenswrapper[4727]: E1001 12:46:51.375457 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-slqxs_openshift-multus(5cf1a0b8-9119-44c6-91ea-473317335fb9)\"" pod="openshift-multus/multus-slqxs" podUID="5cf1a0b8-9119-44c6-91ea-473317335fb9" Oct 01 12:47:03 crc kubenswrapper[4727]: I1001 12:47:03.373318 4727 scope.go:117] "RemoveContainer" containerID="6e30e4ca49faf2e9ac3302ec2021e148f3abddfba6a1d82a337dee7352158388" Oct 01 12:47:03 crc kubenswrapper[4727]: I1001 12:47:03.682550 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-slqxs_5cf1a0b8-9119-44c6-91ea-473317335fb9/kube-multus/2.log" Oct 01 12:47:03 crc kubenswrapper[4727]: I1001 12:47:03.683050 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-slqxs" event={"ID":"5cf1a0b8-9119-44c6-91ea-473317335fb9","Type":"ContainerStarted","Data":"4b29d961f324194a586ffdd3924ce87ed711ad3f1c731f27710578718821372b"} Oct 01 12:47:07 crc kubenswrapper[4727]: I1001 12:47:07.030383 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vc7bk" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.046798 4727 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q"] Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.048466 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.050703 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.059807 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q"] Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.160762 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd7gq\" (UniqueName: \"kubernetes.io/projected/684a10a2-03a1-405b-991c-a8aa282ac6ef-kube-api-access-zd7gq\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q\" (UID: \"684a10a2-03a1-405b-991c-a8aa282ac6ef\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.161106 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/684a10a2-03a1-405b-991c-a8aa282ac6ef-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q\" (UID: \"684a10a2-03a1-405b-991c-a8aa282ac6ef\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.161250 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/684a10a2-03a1-405b-991c-a8aa282ac6ef-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q\" (UID: \"684a10a2-03a1-405b-991c-a8aa282ac6ef\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.262130 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd7gq\" (UniqueName: \"kubernetes.io/projected/684a10a2-03a1-405b-991c-a8aa282ac6ef-kube-api-access-zd7gq\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q\" (UID: \"684a10a2-03a1-405b-991c-a8aa282ac6ef\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.262214 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/684a10a2-03a1-405b-991c-a8aa282ac6ef-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q\" (UID: \"684a10a2-03a1-405b-991c-a8aa282ac6ef\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.262277 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/684a10a2-03a1-405b-991c-a8aa282ac6ef-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q\" (UID: \"684a10a2-03a1-405b-991c-a8aa282ac6ef\") " 
pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.262899 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/684a10a2-03a1-405b-991c-a8aa282ac6ef-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q\" (UID: \"684a10a2-03a1-405b-991c-a8aa282ac6ef\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.262914 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/684a10a2-03a1-405b-991c-a8aa282ac6ef-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q\" (UID: \"684a10a2-03a1-405b-991c-a8aa282ac6ef\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.291707 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd7gq\" (UniqueName: \"kubernetes.io/projected/684a10a2-03a1-405b-991c-a8aa282ac6ef-kube-api-access-zd7gq\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q\" (UID: \"684a10a2-03a1-405b-991c-a8aa282ac6ef\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.368253 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:16 crc kubenswrapper[4727]: I1001 12:47:16.834642 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q"] Oct 01 12:47:17 crc kubenswrapper[4727]: I1001 12:47:17.762794 4727 generic.go:334] "Generic (PLEG): container finished" podID="684a10a2-03a1-405b-991c-a8aa282ac6ef" containerID="b467903947713241c132dc70b1aeef6efdc92c46bc3d8a7e4813c9655c7b962c" exitCode=0 Oct 01 12:47:17 crc kubenswrapper[4727]: I1001 12:47:17.762850 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" event={"ID":"684a10a2-03a1-405b-991c-a8aa282ac6ef","Type":"ContainerDied","Data":"b467903947713241c132dc70b1aeef6efdc92c46bc3d8a7e4813c9655c7b962c"} Oct 01 12:47:17 crc kubenswrapper[4727]: I1001 12:47:17.763175 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" event={"ID":"684a10a2-03a1-405b-991c-a8aa282ac6ef","Type":"ContainerStarted","Data":"9b8e7ecf724b9fb7ddac10447ab8a06efaea5e8ca7d4aab59f22dc702b27bca3"} Oct 01 12:47:19 crc kubenswrapper[4727]: I1001 12:47:19.778852 4727 generic.go:334] "Generic (PLEG): container finished" podID="684a10a2-03a1-405b-991c-a8aa282ac6ef" containerID="c87a10a613c584a28cd6e5b9b566c45cdab2c10818cb497696b7219787085594" exitCode=0 Oct 01 12:47:19 crc kubenswrapper[4727]: I1001 12:47:19.778966 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" event={"ID":"684a10a2-03a1-405b-991c-a8aa282ac6ef","Type":"ContainerDied","Data":"c87a10a613c584a28cd6e5b9b566c45cdab2c10818cb497696b7219787085594"} Oct 01 12:47:20 crc kubenswrapper[4727]: I1001 
12:47:20.790044 4727 generic.go:334] "Generic (PLEG): container finished" podID="684a10a2-03a1-405b-991c-a8aa282ac6ef" containerID="e7bcdd5f78210be17567c4005c73a357e896f93281b490215e9fda2abf38f7ed" exitCode=0 Oct 01 12:47:20 crc kubenswrapper[4727]: I1001 12:47:20.790112 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" event={"ID":"684a10a2-03a1-405b-991c-a8aa282ac6ef","Type":"ContainerDied","Data":"e7bcdd5f78210be17567c4005c73a357e896f93281b490215e9fda2abf38f7ed"} Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.078713 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.142480 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd7gq\" (UniqueName: \"kubernetes.io/projected/684a10a2-03a1-405b-991c-a8aa282ac6ef-kube-api-access-zd7gq\") pod \"684a10a2-03a1-405b-991c-a8aa282ac6ef\" (UID: \"684a10a2-03a1-405b-991c-a8aa282ac6ef\") " Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.142686 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/684a10a2-03a1-405b-991c-a8aa282ac6ef-bundle\") pod \"684a10a2-03a1-405b-991c-a8aa282ac6ef\" (UID: \"684a10a2-03a1-405b-991c-a8aa282ac6ef\") " Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.142731 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/684a10a2-03a1-405b-991c-a8aa282ac6ef-util\") pod \"684a10a2-03a1-405b-991c-a8aa282ac6ef\" (UID: \"684a10a2-03a1-405b-991c-a8aa282ac6ef\") " Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.143605 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/684a10a2-03a1-405b-991c-a8aa282ac6ef-bundle" (OuterVolumeSpecName: "bundle") pod "684a10a2-03a1-405b-991c-a8aa282ac6ef" (UID: "684a10a2-03a1-405b-991c-a8aa282ac6ef"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.148555 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684a10a2-03a1-405b-991c-a8aa282ac6ef-kube-api-access-zd7gq" (OuterVolumeSpecName: "kube-api-access-zd7gq") pod "684a10a2-03a1-405b-991c-a8aa282ac6ef" (UID: "684a10a2-03a1-405b-991c-a8aa282ac6ef"). InnerVolumeSpecName "kube-api-access-zd7gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.155441 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/684a10a2-03a1-405b-991c-a8aa282ac6ef-util" (OuterVolumeSpecName: "util") pod "684a10a2-03a1-405b-991c-a8aa282ac6ef" (UID: "684a10a2-03a1-405b-991c-a8aa282ac6ef"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.243890 4727 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/684a10a2-03a1-405b-991c-a8aa282ac6ef-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.243920 4727 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/684a10a2-03a1-405b-991c-a8aa282ac6ef-util\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.243929 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd7gq\" (UniqueName: \"kubernetes.io/projected/684a10a2-03a1-405b-991c-a8aa282ac6ef-kube-api-access-zd7gq\") on node \"crc\" DevicePath \"\"" Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.805384 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" event={"ID":"684a10a2-03a1-405b-991c-a8aa282ac6ef","Type":"ContainerDied","Data":"9b8e7ecf724b9fb7ddac10447ab8a06efaea5e8ca7d4aab59f22dc702b27bca3"} Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.805441 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b8e7ecf724b9fb7ddac10447ab8a06efaea5e8ca7d4aab59f22dc702b27bca3" Oct 01 12:47:22 crc kubenswrapper[4727]: I1001 12:47:22.806098 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q" Oct 01 12:47:24 crc kubenswrapper[4727]: I1001 12:47:24.914601 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-gpd58"] Oct 01 12:47:24 crc kubenswrapper[4727]: E1001 12:47:24.914828 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684a10a2-03a1-405b-991c-a8aa282ac6ef" containerName="util" Oct 01 12:47:24 crc kubenswrapper[4727]: I1001 12:47:24.914841 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="684a10a2-03a1-405b-991c-a8aa282ac6ef" containerName="util" Oct 01 12:47:24 crc kubenswrapper[4727]: E1001 12:47:24.914853 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684a10a2-03a1-405b-991c-a8aa282ac6ef" containerName="pull" Oct 01 12:47:24 crc kubenswrapper[4727]: I1001 12:47:24.914859 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="684a10a2-03a1-405b-991c-a8aa282ac6ef" containerName="pull" Oct 01 12:47:24 crc kubenswrapper[4727]: E1001 12:47:24.914865 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684a10a2-03a1-405b-991c-a8aa282ac6ef" containerName="extract" Oct 01 12:47:24 crc kubenswrapper[4727]: I1001 12:47:24.914872 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="684a10a2-03a1-405b-991c-a8aa282ac6ef" containerName="extract" Oct 01 12:47:24 crc kubenswrapper[4727]: I1001 12:47:24.914971 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="684a10a2-03a1-405b-991c-a8aa282ac6ef" containerName="extract" Oct 01 12:47:24 crc kubenswrapper[4727]: I1001 12:47:24.915367 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gpd58" Oct 01 12:47:24 crc kubenswrapper[4727]: I1001 12:47:24.917406 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 01 12:47:24 crc kubenswrapper[4727]: I1001 12:47:24.917605 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rlvr7" Oct 01 12:47:24 crc kubenswrapper[4727]: I1001 12:47:24.918779 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 01 12:47:24 crc kubenswrapper[4727]: I1001 12:47:24.963960 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-gpd58"] Oct 01 12:47:24 crc kubenswrapper[4727]: I1001 12:47:24.978109 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2prp\" (UniqueName: \"kubernetes.io/projected/872ad5b0-13ab-4c3f-aa66-4a2b7f8ca2b2-kube-api-access-s2prp\") pod \"nmstate-operator-5d6f6cfd66-gpd58\" (UID: \"872ad5b0-13ab-4c3f-aa66-4a2b7f8ca2b2\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gpd58" Oct 01 12:47:25 crc kubenswrapper[4727]: I1001 12:47:25.080117 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2prp\" (UniqueName: \"kubernetes.io/projected/872ad5b0-13ab-4c3f-aa66-4a2b7f8ca2b2-kube-api-access-s2prp\") pod \"nmstate-operator-5d6f6cfd66-gpd58\" (UID: \"872ad5b0-13ab-4c3f-aa66-4a2b7f8ca2b2\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gpd58" Oct 01 12:47:25 crc kubenswrapper[4727]: I1001 12:47:25.098471 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2prp\" (UniqueName: \"kubernetes.io/projected/872ad5b0-13ab-4c3f-aa66-4a2b7f8ca2b2-kube-api-access-s2prp\") pod \"nmstate-operator-5d6f6cfd66-gpd58\" (UID: \"872ad5b0-13ab-4c3f-aa66-4a2b7f8ca2b2\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gpd58" Oct 01 12:47:25 crc kubenswrapper[4727]: I1001 12:47:25.228059 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gpd58" Oct 01 12:47:25 crc kubenswrapper[4727]: I1001 12:47:25.438584 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-gpd58"] Oct 01 12:47:25 crc kubenswrapper[4727]: I1001 12:47:25.824374 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gpd58" event={"ID":"872ad5b0-13ab-4c3f-aa66-4a2b7f8ca2b2","Type":"ContainerStarted","Data":"11cf9e0e42efd9e9d343e4b879f8289e97de072585d194838612ae8c4c117c7c"} Oct 01 12:47:27 crc kubenswrapper[4727]: I1001 12:47:27.847844 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gpd58" event={"ID":"872ad5b0-13ab-4c3f-aa66-4a2b7f8ca2b2","Type":"ContainerStarted","Data":"7bb6d061973acffb50ca321f69f6d44bd232c9c0d542149da81c9b293a013ade"} Oct 01 12:47:27 crc kubenswrapper[4727]: I1001 12:47:27.863481 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-gpd58" podStartSLOduration=1.921567183 podStartE2EDuration="3.86346072s" podCreationTimestamp="2025-10-01 12:47:24 +0000 UTC" firstStartedPulling="2025-10-01 12:47:25.448053728 +0000 UTC m=+623.769408555" lastFinishedPulling="2025-10-01 12:47:27.389947255 +0000 UTC m=+625.711302092" observedRunningTime="2025-10-01 12:47:27.862053555 +0000 UTC m=+626.183408422" watchObservedRunningTime="2025-10-01 12:47:27.86346072 +0000 UTC m=+626.184815567" Oct 01 12:47:28 crc kubenswrapper[4727]: I1001 12:47:28.933380 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-nclq7"] Oct 01 12:47:28 crc kubenswrapper[4727]: I1001 12:47:28.934312 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-nclq7" Oct 01 12:47:28 crc kubenswrapper[4727]: I1001 12:47:28.938384 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-gq9ml" Oct 01 12:47:28 crc kubenswrapper[4727]: I1001 12:47:28.946666 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-nclq7"] Oct 01 12:47:28 crc kubenswrapper[4727]: I1001 12:47:28.956194 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd"] Oct 01 12:47:28 crc kubenswrapper[4727]: I1001 12:47:28.956939 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" Oct 01 12:47:28 crc kubenswrapper[4727]: I1001 12:47:28.961532 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 01 12:47:28 crc kubenswrapper[4727]: I1001 12:47:28.979621 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-bzph7"] Oct 01 12:47:28 crc kubenswrapper[4727]: I1001 12:47:28.985884 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd"] Oct 01 12:47:28 crc kubenswrapper[4727]: I1001 12:47:28.986041 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.056720 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/307c2e1d-d5d0-4f21-a708-0a43cb624fff-dbus-socket\") pod \"nmstate-handler-bzph7\" (UID: \"307c2e1d-d5d0-4f21-a708-0a43cb624fff\") " pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.056832 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/724b6b3d-215d-4ccc-a966-fc58f517a29f-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-7vwzd\" (UID: \"724b6b3d-215d-4ccc-a966-fc58f517a29f\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.056884 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/307c2e1d-d5d0-4f21-a708-0a43cb624fff-ovs-socket\") pod \"nmstate-handler-bzph7\" (UID: \"307c2e1d-d5d0-4f21-a708-0a43cb624fff\") " pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.056915 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88qnt\" (UniqueName: \"kubernetes.io/projected/a429079b-262e-4f8b-9fc8-4dc0ad068fd5-kube-api-access-88qnt\") pod \"nmstate-metrics-58fcddf996-nclq7\" (UID: \"a429079b-262e-4f8b-9fc8-4dc0ad068fd5\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-nclq7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.056985 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7xqp\" (UniqueName: \"kubernetes.io/projected/307c2e1d-d5d0-4f21-a708-0a43cb624fff-kube-api-access-r7xqp\") pod \"nmstate-handler-bzph7\" (UID: \"307c2e1d-d5d0-4f21-a708-0a43cb624fff\") " pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.057062 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stzqf\" (UniqueName: \"kubernetes.io/projected/724b6b3d-215d-4ccc-a966-fc58f517a29f-kube-api-access-stzqf\") pod \"nmstate-webhook-6d689559c5-7vwzd\" (UID: \"724b6b3d-215d-4ccc-a966-fc58f517a29f\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.057114 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/307c2e1d-d5d0-4f21-a708-0a43cb624fff-nmstate-lock\") pod \"nmstate-handler-bzph7\" (UID: \"307c2e1d-d5d0-4f21-a708-0a43cb624fff\") " pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.153296 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2"] Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.153960 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.156643 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.156935 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-x8crp" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.158108 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/307c2e1d-d5d0-4f21-a708-0a43cb624fff-ovs-socket\") pod \"nmstate-handler-bzph7\" (UID: \"307c2e1d-d5d0-4f21-a708-0a43cb624fff\") " pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.158158 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88qnt\" (UniqueName: \"kubernetes.io/projected/a429079b-262e-4f8b-9fc8-4dc0ad068fd5-kube-api-access-88qnt\") pod \"nmstate-metrics-58fcddf996-nclq7\" (UID: \"a429079b-262e-4f8b-9fc8-4dc0ad068fd5\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-nclq7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.158937 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/307c2e1d-d5d0-4f21-a708-0a43cb624fff-ovs-socket\") pod \"nmstate-handler-bzph7\" (UID: \"307c2e1d-d5d0-4f21-a708-0a43cb624fff\") " pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.158989 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7xqp\" (UniqueName: \"kubernetes.io/projected/307c2e1d-d5d0-4f21-a708-0a43cb624fff-kube-api-access-r7xqp\") pod \"nmstate-handler-bzph7\" (UID: \"307c2e1d-d5d0-4f21-a708-0a43cb624fff\") " pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.159424 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stzqf\" (UniqueName: \"kubernetes.io/projected/724b6b3d-215d-4ccc-a966-fc58f517a29f-kube-api-access-stzqf\") pod \"nmstate-webhook-6d689559c5-7vwzd\" (UID: \"724b6b3d-215d-4ccc-a966-fc58f517a29f\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.159452 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/307c2e1d-d5d0-4f21-a708-0a43cb624fff-nmstate-lock\") pod \"nmstate-handler-bzph7\" (UID: \"307c2e1d-d5d0-4f21-a708-0a43cb624fff\") " pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.159530 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/307c2e1d-d5d0-4f21-a708-0a43cb624fff-dbus-socket\") pod \"nmstate-handler-bzph7\" (UID: \"307c2e1d-d5d0-4f21-a708-0a43cb624fff\") " pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.159588 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/724b6b3d-215d-4ccc-a966-fc58f517a29f-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-7vwzd\" (UID: \"724b6b3d-215d-4ccc-a966-fc58f517a29f\") " 
pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.160099 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/307c2e1d-d5d0-4f21-a708-0a43cb624fff-nmstate-lock\") pod \"nmstate-handler-bzph7\" (UID: \"307c2e1d-d5d0-4f21-a708-0a43cb624fff\") " pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.160306 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2"] Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.160435 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/307c2e1d-d5d0-4f21-a708-0a43cb624fff-dbus-socket\") pod \"nmstate-handler-bzph7\" (UID: \"307c2e1d-d5d0-4f21-a708-0a43cb624fff\") " pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.162281 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.166154 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/724b6b3d-215d-4ccc-a966-fc58f517a29f-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-7vwzd\" (UID: \"724b6b3d-215d-4ccc-a966-fc58f517a29f\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.182605 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7xqp\" (UniqueName: \"kubernetes.io/projected/307c2e1d-d5d0-4f21-a708-0a43cb624fff-kube-api-access-r7xqp\") pod \"nmstate-handler-bzph7\" (UID: \"307c2e1d-d5d0-4f21-a708-0a43cb624fff\") " pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.185616 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stzqf\" (UniqueName: \"kubernetes.io/projected/724b6b3d-215d-4ccc-a966-fc58f517a29f-kube-api-access-stzqf\") pod \"nmstate-webhook-6d689559c5-7vwzd\" (UID: \"724b6b3d-215d-4ccc-a966-fc58f517a29f\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.190137 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88qnt\" (UniqueName: \"kubernetes.io/projected/a429079b-262e-4f8b-9fc8-4dc0ad068fd5-kube-api-access-88qnt\") pod \"nmstate-metrics-58fcddf996-nclq7\" (UID: \"a429079b-262e-4f8b-9fc8-4dc0ad068fd5\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-nclq7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.255987 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-nclq7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.260855 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf5h8\" (UniqueName: \"kubernetes.io/projected/e5813fa0-c580-4ab9-8118-b8dc8ff39470-kube-api-access-hf5h8\") pod \"nmstate-console-plugin-864bb6dfb5-wbtm2\" (UID: \"e5813fa0-c580-4ab9-8118-b8dc8ff39470\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.260925 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e5813fa0-c580-4ab9-8118-b8dc8ff39470-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-wbtm2\" (UID: \"e5813fa0-c580-4ab9-8118-b8dc8ff39470\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.261158 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5813fa0-c580-4ab9-8118-b8dc8ff39470-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-wbtm2\" (UID: \"e5813fa0-c580-4ab9-8118-b8dc8ff39470\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.272627 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.301374 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.362771 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf5h8\" (UniqueName: \"kubernetes.io/projected/e5813fa0-c580-4ab9-8118-b8dc8ff39470-kube-api-access-hf5h8\") pod \"nmstate-console-plugin-864bb6dfb5-wbtm2\" (UID: \"e5813fa0-c580-4ab9-8118-b8dc8ff39470\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.362834 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e5813fa0-c580-4ab9-8118-b8dc8ff39470-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-wbtm2\" (UID: \"e5813fa0-c580-4ab9-8118-b8dc8ff39470\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.362864 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5813fa0-c580-4ab9-8118-b8dc8ff39470-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-wbtm2\" (UID: \"e5813fa0-c580-4ab9-8118-b8dc8ff39470\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" Oct 01 12:47:29 crc kubenswrapper[4727]: E1001 12:47:29.363035 4727 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 01 12:47:29 crc kubenswrapper[4727]: E1001 12:47:29.363104 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5813fa0-c580-4ab9-8118-b8dc8ff39470-plugin-serving-cert podName:e5813fa0-c580-4ab9-8118-b8dc8ff39470 nodeName:}" failed. 
No retries permitted until 2025-10-01 12:47:29.863082514 +0000 UTC m=+628.184437351 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e5813fa0-c580-4ab9-8118-b8dc8ff39470-plugin-serving-cert") pod "nmstate-console-plugin-864bb6dfb5-wbtm2" (UID: "e5813fa0-c580-4ab9-8118-b8dc8ff39470") : secret "plugin-serving-cert" not found Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.364798 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e5813fa0-c580-4ab9-8118-b8dc8ff39470-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-wbtm2\" (UID: \"e5813fa0-c580-4ab9-8118-b8dc8ff39470\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.368105 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6497c8cbfb-b679m"] Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.368810 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.376079 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6497c8cbfb-b679m"] Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.388223 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf5h8\" (UniqueName: \"kubernetes.io/projected/e5813fa0-c580-4ab9-8118-b8dc8ff39470-kube-api-access-hf5h8\") pod \"nmstate-console-plugin-864bb6dfb5-wbtm2\" (UID: \"e5813fa0-c580-4ab9-8118-b8dc8ff39470\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.463677 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/335d2042-1647-4c4a-8c6a-49991c33d1f6-console-serving-cert\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.463752 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/335d2042-1647-4c4a-8c6a-49991c33d1f6-trusted-ca-bundle\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.463776 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/335d2042-1647-4c4a-8c6a-49991c33d1f6-console-config\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.463972 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/335d2042-1647-4c4a-8c6a-49991c33d1f6-oauth-serving-cert\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.464089 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-pch69\" (UniqueName: \"kubernetes.io/projected/335d2042-1647-4c4a-8c6a-49991c33d1f6-kube-api-access-pch69\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.464162 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/335d2042-1647-4c4a-8c6a-49991c33d1f6-service-ca\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.464223 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/335d2042-1647-4c4a-8c6a-49991c33d1f6-console-oauth-config\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.540680 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd"] Oct 01 12:47:29 crc kubenswrapper[4727]: W1001 12:47:29.545327 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod724b6b3d_215d_4ccc_a966_fc58f517a29f.slice/crio-7bef68d6fde08fc6707fdffb5e4c81a72f82e39e1898deabc035ec98f78be330 WatchSource:0}: Error finding container 7bef68d6fde08fc6707fdffb5e4c81a72f82e39e1898deabc035ec98f78be330: Status 404 returned error can't find the container with id 7bef68d6fde08fc6707fdffb5e4c81a72f82e39e1898deabc035ec98f78be330 Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.565717 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/335d2042-1647-4c4a-8c6a-49991c33d1f6-trusted-ca-bundle\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.567847 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/335d2042-1647-4c4a-8c6a-49991c33d1f6-trusted-ca-bundle\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.567953 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/335d2042-1647-4c4a-8c6a-49991c33d1f6-console-config\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.569159 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/335d2042-1647-4c4a-8c6a-49991c33d1f6-oauth-serving-cert\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.569231 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pch69\" 
(UniqueName: \"kubernetes.io/projected/335d2042-1647-4c4a-8c6a-49991c33d1f6-kube-api-access-pch69\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.569295 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/335d2042-1647-4c4a-8c6a-49991c33d1f6-service-ca\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.569345 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/335d2042-1647-4c4a-8c6a-49991c33d1f6-console-oauth-config\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.569392 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/335d2042-1647-4c4a-8c6a-49991c33d1f6-console-serving-cert\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.569615 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/335d2042-1647-4c4a-8c6a-49991c33d1f6-console-config\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.570560 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/335d2042-1647-4c4a-8c6a-49991c33d1f6-oauth-serving-cert\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.571133 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/335d2042-1647-4c4a-8c6a-49991c33d1f6-service-ca\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.575285 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/335d2042-1647-4c4a-8c6a-49991c33d1f6-console-serving-cert\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.576762 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/335d2042-1647-4c4a-8c6a-49991c33d1f6-console-oauth-config\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.588664 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-nclq7"] Oct 01 12:47:29 crc kubenswrapper[4727]: 
I1001 12:47:29.593477 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pch69\" (UniqueName: \"kubernetes.io/projected/335d2042-1647-4c4a-8c6a-49991c33d1f6-kube-api-access-pch69\") pod \"console-6497c8cbfb-b679m\" (UID: \"335d2042-1647-4c4a-8c6a-49991c33d1f6\") " pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: W1001 12:47:29.597381 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda429079b_262e_4f8b_9fc8_4dc0ad068fd5.slice/crio-bbaee244b08a39fa8b0af3ec2c9a763832f8cb423396eb33ada131453c077209 WatchSource:0}: Error finding container bbaee244b08a39fa8b0af3ec2c9a763832f8cb423396eb33ada131453c077209: Status 404 returned error can't find the container with id bbaee244b08a39fa8b0af3ec2c9a763832f8cb423396eb33ada131453c077209 Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.686333 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.875482 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" event={"ID":"724b6b3d-215d-4ccc-a966-fc58f517a29f","Type":"ContainerStarted","Data":"7bef68d6fde08fc6707fdffb5e4c81a72f82e39e1898deabc035ec98f78be330"} Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.875926 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5813fa0-c580-4ab9-8118-b8dc8ff39470-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-wbtm2\" (UID: \"e5813fa0-c580-4ab9-8118-b8dc8ff39470\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.877555 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-nclq7" event={"ID":"a429079b-262e-4f8b-9fc8-4dc0ad068fd5","Type":"ContainerStarted","Data":"bbaee244b08a39fa8b0af3ec2c9a763832f8cb423396eb33ada131453c077209"} Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.878391 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bzph7" event={"ID":"307c2e1d-d5d0-4f21-a708-0a43cb624fff","Type":"ContainerStarted","Data":"6c12bcf0a5bc3e5807e2d0033c82cd34909a154711a870e10b1fe3c72bafdf22"} Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.879950 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5813fa0-c580-4ab9-8118-b8dc8ff39470-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-wbtm2\" (UID: \"e5813fa0-c580-4ab9-8118-b8dc8ff39470\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" Oct 01 12:47:29 crc kubenswrapper[4727]: I1001 12:47:29.913601 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6497c8cbfb-b679m"] Oct 01 12:47:29 crc kubenswrapper[4727]: W1001 12:47:29.920067 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod335d2042_1647_4c4a_8c6a_49991c33d1f6.slice/crio-1d921e621fed918bdccdf97424583e22ac84cb994d9fd5db57dc79152378b465 WatchSource:0}: Error finding container 1d921e621fed918bdccdf97424583e22ac84cb994d9fd5db57dc79152378b465: Status 404 returned error can't find the container with id 
1d921e621fed918bdccdf97424583e22ac84cb994d9fd5db57dc79152378b465 Oct 01 12:47:30 crc kubenswrapper[4727]: I1001 12:47:30.120271 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" Oct 01 12:47:30 crc kubenswrapper[4727]: I1001 12:47:30.310490 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2"] Oct 01 12:47:30 crc kubenswrapper[4727]: I1001 12:47:30.903781 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6497c8cbfb-b679m" event={"ID":"335d2042-1647-4c4a-8c6a-49991c33d1f6","Type":"ContainerStarted","Data":"8c6c9e4190ccea8375b1abc86587ac573e267714f0c3e5d966b0eccb93709b55"} Oct 01 12:47:30 crc kubenswrapper[4727]: I1001 12:47:30.904357 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6497c8cbfb-b679m" event={"ID":"335d2042-1647-4c4a-8c6a-49991c33d1f6","Type":"ContainerStarted","Data":"1d921e621fed918bdccdf97424583e22ac84cb994d9fd5db57dc79152378b465"} Oct 01 12:47:30 crc kubenswrapper[4727]: I1001 12:47:30.905956 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" event={"ID":"e5813fa0-c580-4ab9-8118-b8dc8ff39470","Type":"ContainerStarted","Data":"b43328a4b42187dcf69d5d29199a1c794ee786ce57c471fb423d1a49e473d4a5"} Oct 01 12:47:30 crc kubenswrapper[4727]: I1001 12:47:30.941476 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6497c8cbfb-b679m" podStartSLOduration=1.941450077 podStartE2EDuration="1.941450077s" podCreationTimestamp="2025-10-01 12:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:47:30.934970021 +0000 UTC m=+629.256324868" watchObservedRunningTime="2025-10-01 12:47:30.941450077 +0000 UTC m=+629.262804914" Oct 01 12:47:33 crc kubenswrapper[4727]: I1001 12:47:33.927878 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-nclq7" event={"ID":"a429079b-262e-4f8b-9fc8-4dc0ad068fd5","Type":"ContainerStarted","Data":"69bf1e66b6855c56f31988903841aecb19e9eda625114c9174c02eaa088e5964"} Oct 01 12:47:33 crc kubenswrapper[4727]: I1001 12:47:33.929702 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bzph7" event={"ID":"307c2e1d-d5d0-4f21-a708-0a43cb624fff","Type":"ContainerStarted","Data":"edcaa6d7cdf456502dc6941be8c9db8548d882b4c842629a3f2469b74c50abd2"} Oct 01 12:47:33 crc kubenswrapper[4727]: I1001 12:47:33.929865 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:33 crc kubenswrapper[4727]: I1001 12:47:33.932537 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" event={"ID":"e5813fa0-c580-4ab9-8118-b8dc8ff39470","Type":"ContainerStarted","Data":"da24acbc6facb0dead72572fe1c07e45a6294f375755ff257a95b97c1759581a"} Oct 01 12:47:33 crc kubenswrapper[4727]: I1001 12:47:33.934063 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" event={"ID":"724b6b3d-215d-4ccc-a966-fc58f517a29f","Type":"ContainerStarted","Data":"77f309690b958e6e90f8eddda6047202b33326da329f1fb0fd1d6c77ff5c01c6"} Oct 01 12:47:33 crc kubenswrapper[4727]: I1001 12:47:33.934195 4727 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" Oct 01 12:47:33 crc kubenswrapper[4727]: I1001 12:47:33.945070 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bzph7" podStartSLOduration=1.869136622 podStartE2EDuration="5.945049593s" podCreationTimestamp="2025-10-01 12:47:28 +0000 UTC" firstStartedPulling="2025-10-01 12:47:29.322523427 +0000 UTC m=+627.643878264" lastFinishedPulling="2025-10-01 12:47:33.398436358 +0000 UTC m=+631.719791235" observedRunningTime="2025-10-01 12:47:33.942192773 +0000 UTC m=+632.263547630" watchObservedRunningTime="2025-10-01 12:47:33.945049593 +0000 UTC m=+632.266404430" Oct 01 12:47:33 crc kubenswrapper[4727]: I1001 12:47:33.976540 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-wbtm2" podStartSLOduration=1.91031705 podStartE2EDuration="4.976520962s" podCreationTimestamp="2025-10-01 12:47:29 +0000 UTC" firstStartedPulling="2025-10-01 12:47:30.317275251 +0000 UTC m=+628.638630088" lastFinishedPulling="2025-10-01 12:47:33.383479133 +0000 UTC m=+631.704834000" observedRunningTime="2025-10-01 12:47:33.974940941 +0000 UTC m=+632.296295788" watchObservedRunningTime="2025-10-01 12:47:33.976520962 +0000 UTC m=+632.297875799" Oct 01 12:47:33 crc kubenswrapper[4727]: I1001 12:47:33.978644 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" podStartSLOduration=2.143085684 podStartE2EDuration="5.978635628s" podCreationTimestamp="2025-10-01 12:47:28 +0000 UTC" firstStartedPulling="2025-10-01 12:47:29.549492039 +0000 UTC m=+627.870846876" lastFinishedPulling="2025-10-01 12:47:33.385041963 +0000 UTC m=+631.706396820" observedRunningTime="2025-10-01 12:47:33.959882093 +0000 UTC m=+632.281236950" watchObservedRunningTime="2025-10-01 12:47:33.978635628 +0000 UTC m=+632.299990475" Oct 01 12:47:35 crc kubenswrapper[4727]: I1001 12:47:35.947378 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-nclq7" event={"ID":"a429079b-262e-4f8b-9fc8-4dc0ad068fd5","Type":"ContainerStarted","Data":"04d393c47a5af91720e37740c999e4ea52cc91d47eb3f949e54564d45d3b28ea"} Oct 01 12:47:35 crc kubenswrapper[4727]: I1001 12:47:35.963464 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-nclq7" podStartSLOduration=1.790559958 podStartE2EDuration="7.963442708s" podCreationTimestamp="2025-10-01 12:47:28 +0000 UTC" firstStartedPulling="2025-10-01 12:47:29.599705112 +0000 UTC m=+627.921059949" lastFinishedPulling="2025-10-01 12:47:35.772587862 +0000 UTC m=+634.093942699" observedRunningTime="2025-10-01 12:47:35.962047493 +0000 UTC m=+634.283402350" watchObservedRunningTime="2025-10-01 12:47:35.963442708 +0000 UTC m=+634.284797545" Oct 01 12:47:39 crc kubenswrapper[4727]: I1001 12:47:39.324391 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-bzph7" Oct 01 12:47:39 crc kubenswrapper[4727]: I1001 12:47:39.687046 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:39 crc kubenswrapper[4727]: I1001 12:47:39.687332 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:39 crc 
kubenswrapper[4727]: I1001 12:47:39.692750 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:39 crc kubenswrapper[4727]: I1001 12:47:39.979660 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6497c8cbfb-b679m" Oct 01 12:47:40 crc kubenswrapper[4727]: I1001 12:47:40.032028 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-89tz6"] Oct 01 12:47:49 crc kubenswrapper[4727]: I1001 12:47:49.278228 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-7vwzd" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.176508 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw"] Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.178267 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.180359 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.190403 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw"] Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.236487 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw\" (UID: \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.236562 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw\" (UID: \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.236604 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcwpb\" (UniqueName: \"kubernetes.io/projected/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-kube-api-access-wcwpb\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw\" (UID: \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.337928 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw\" (UID: \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.338035 4727 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wcwpb\" (UniqueName: \"kubernetes.io/projected/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-kube-api-access-wcwpb\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw\" (UID: \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.338099 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw\" (UID: \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.338702 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw\" (UID: \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.338773 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw\" (UID: \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.360730 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcwpb\" (UniqueName: \"kubernetes.io/projected/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-kube-api-access-wcwpb\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw\" (UID: \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.501685 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:03 crc kubenswrapper[4727]: I1001 12:48:03.710214 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw"] Oct 01 12:48:04 crc kubenswrapper[4727]: I1001 12:48:04.128144 4727 generic.go:334] "Generic (PLEG): container finished" podID="3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" containerID="4518b267f1fd8d73802b085219bd471efca98b24757f6ae8faa357a9463b7c30" exitCode=0 Oct 01 12:48:04 crc kubenswrapper[4727]: I1001 12:48:04.128318 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" event={"ID":"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0","Type":"ContainerDied","Data":"4518b267f1fd8d73802b085219bd471efca98b24757f6ae8faa357a9463b7c30"} Oct 01 12:48:04 crc kubenswrapper[4727]: I1001 12:48:04.128608 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" event={"ID":"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0","Type":"ContainerStarted","Data":"bba258a2ffc6f4f67dcb67c2f6f2beefd290628e0291ea415ca44cad703d2909"} Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.090344 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-89tz6" podUID="366b7e92-ea45-4052-8ddc-9540d534a7ad" containerName="console" containerID="cri-o://d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0" gracePeriod=15 Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.491849 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-89tz6_366b7e92-ea45-4052-8ddc-9540d534a7ad/console/0.log" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.492030 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.567727 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-oauth-serving-cert\") pod \"366b7e92-ea45-4052-8ddc-9540d534a7ad\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.567793 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-serving-cert\") pod \"366b7e92-ea45-4052-8ddc-9540d534a7ad\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.567820 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-oauth-config\") pod \"366b7e92-ea45-4052-8ddc-9540d534a7ad\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.567859 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-trusted-ca-bundle\") pod \"366b7e92-ea45-4052-8ddc-9540d534a7ad\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.567887 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-service-ca\") pod \"366b7e92-ea45-4052-8ddc-9540d534a7ad\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.567914 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xt8w\" (UniqueName: \"kubernetes.io/projected/366b7e92-ea45-4052-8ddc-9540d534a7ad-kube-api-access-2xt8w\") pod \"366b7e92-ea45-4052-8ddc-9540d534a7ad\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.568829 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-service-ca" (OuterVolumeSpecName: "service-ca") pod "366b7e92-ea45-4052-8ddc-9540d534a7ad" (UID: "366b7e92-ea45-4052-8ddc-9540d534a7ad"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.568882 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-config\") pod \"366b7e92-ea45-4052-8ddc-9540d534a7ad\" (UID: \"366b7e92-ea45-4052-8ddc-9540d534a7ad\") " Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.568927 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "366b7e92-ea45-4052-8ddc-9540d534a7ad" (UID: "366b7e92-ea45-4052-8ddc-9540d534a7ad"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.568988 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-config" (OuterVolumeSpecName: "console-config") pod "366b7e92-ea45-4052-8ddc-9540d534a7ad" (UID: "366b7e92-ea45-4052-8ddc-9540d534a7ad"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.569164 4727 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.569184 4727 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.569195 4727 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.569213 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "366b7e92-ea45-4052-8ddc-9540d534a7ad" (UID: "366b7e92-ea45-4052-8ddc-9540d534a7ad"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.575416 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "366b7e92-ea45-4052-8ddc-9540d534a7ad" (UID: "366b7e92-ea45-4052-8ddc-9540d534a7ad"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.575453 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/366b7e92-ea45-4052-8ddc-9540d534a7ad-kube-api-access-2xt8w" (OuterVolumeSpecName: "kube-api-access-2xt8w") pod "366b7e92-ea45-4052-8ddc-9540d534a7ad" (UID: "366b7e92-ea45-4052-8ddc-9540d534a7ad"). InnerVolumeSpecName "kube-api-access-2xt8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.575911 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "366b7e92-ea45-4052-8ddc-9540d534a7ad" (UID: "366b7e92-ea45-4052-8ddc-9540d534a7ad"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.670679 4727 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/366b7e92-ea45-4052-8ddc-9540d534a7ad-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.670719 4727 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.670732 4727 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/366b7e92-ea45-4052-8ddc-9540d534a7ad-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:05 crc kubenswrapper[4727]: I1001 12:48:05.670747 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xt8w\" (UniqueName: \"kubernetes.io/projected/366b7e92-ea45-4052-8ddc-9540d534a7ad-kube-api-access-2xt8w\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.141042 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-89tz6_366b7e92-ea45-4052-8ddc-9540d534a7ad/console/0.log" Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.141389 4727 generic.go:334] "Generic (PLEG): container finished" podID="366b7e92-ea45-4052-8ddc-9540d534a7ad" containerID="d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0" exitCode=2 Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.141465 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-89tz6" Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.141485 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-89tz6" event={"ID":"366b7e92-ea45-4052-8ddc-9540d534a7ad","Type":"ContainerDied","Data":"d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0"} Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.141524 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-89tz6" event={"ID":"366b7e92-ea45-4052-8ddc-9540d534a7ad","Type":"ContainerDied","Data":"034be7a53c854ca94b0b98036eb730dad3d32ff6bc0701a35eea425000108c89"} Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.141550 4727 scope.go:117] "RemoveContainer" containerID="d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0" Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.147024 4727 generic.go:334] "Generic (PLEG): container finished" podID="3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" containerID="3c4b7f8201262852124fb900b915376ca0dc8bb903f52afa1d498555fc64cb7b" exitCode=0 Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.147076 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" event={"ID":"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0","Type":"ContainerDied","Data":"3c4b7f8201262852124fb900b915376ca0dc8bb903f52afa1d498555fc64cb7b"} Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.162215 4727 scope.go:117] "RemoveContainer" containerID="d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0" Oct 01 12:48:06 crc kubenswrapper[4727]: E1001 12:48:06.166644 4727 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0\": container with ID starting with d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0 not found: ID does not exist" containerID="d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0" Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.166718 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0"} err="failed to get container status \"d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0\": rpc error: code = NotFound desc = could not find container \"d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0\": container with ID starting with d6e6cc3e00f68f1a3411ffb99a4bd7381075b21f032ba8875d5d889f2c3922e0 not found: ID does not exist" Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.177682 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-89tz6"] Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.180847 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-89tz6"] Oct 01 12:48:06 crc kubenswrapper[4727]: I1001 12:48:06.378386 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="366b7e92-ea45-4052-8ddc-9540d534a7ad" path="/var/lib/kubelet/pods/366b7e92-ea45-4052-8ddc-9540d534a7ad/volumes" Oct 01 12:48:07 crc kubenswrapper[4727]: I1001 12:48:07.155387 4727 generic.go:334] "Generic (PLEG): container finished" podID="3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" containerID="493cedf09d8b9fc34224e7b78765a97a9a8a45d840363ecb0712235dce431a11" exitCode=0 Oct 01 12:48:07 crc kubenswrapper[4727]: I1001 12:48:07.155499 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" event={"ID":"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0","Type":"ContainerDied","Data":"493cedf09d8b9fc34224e7b78765a97a9a8a45d840363ecb0712235dce431a11"} Oct 01 12:48:08 crc kubenswrapper[4727]: I1001 12:48:08.387551 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:08 crc kubenswrapper[4727]: I1001 12:48:08.507191 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-bundle\") pod \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\" (UID: \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\") " Oct 01 12:48:08 crc kubenswrapper[4727]: I1001 12:48:08.507385 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcwpb\" (UniqueName: \"kubernetes.io/projected/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-kube-api-access-wcwpb\") pod \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\" (UID: \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\") " Oct 01 12:48:08 crc kubenswrapper[4727]: I1001 12:48:08.507459 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-util\") pod \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\" (UID: \"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0\") " Oct 01 12:48:08 crc kubenswrapper[4727]: I1001 12:48:08.508793 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-bundle" (OuterVolumeSpecName: "bundle") pod "3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" (UID: "3e0c2a76-78ea-4d0c-bfd4-541f729a20a0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:48:08 crc kubenswrapper[4727]: I1001 12:48:08.513820 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-kube-api-access-wcwpb" (OuterVolumeSpecName: "kube-api-access-wcwpb") pod "3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" (UID: "3e0c2a76-78ea-4d0c-bfd4-541f729a20a0"). InnerVolumeSpecName "kube-api-access-wcwpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:48:08 crc kubenswrapper[4727]: I1001 12:48:08.525065 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-util" (OuterVolumeSpecName: "util") pod "3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" (UID: "3e0c2a76-78ea-4d0c-bfd4-541f729a20a0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:48:08 crc kubenswrapper[4727]: I1001 12:48:08.609723 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcwpb\" (UniqueName: \"kubernetes.io/projected/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-kube-api-access-wcwpb\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:08 crc kubenswrapper[4727]: I1001 12:48:08.610299 4727 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-util\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:08 crc kubenswrapper[4727]: I1001 12:48:08.610314 4727 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e0c2a76-78ea-4d0c-bfd4-541f729a20a0-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:48:09 crc kubenswrapper[4727]: I1001 12:48:09.170827 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" event={"ID":"3e0c2a76-78ea-4d0c-bfd4-541f729a20a0","Type":"ContainerDied","Data":"bba258a2ffc6f4f67dcb67c2f6f2beefd290628e0291ea415ca44cad703d2909"} Oct 01 12:48:09 crc kubenswrapper[4727]: I1001 12:48:09.171147 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bba258a2ffc6f4f67dcb67c2f6f2beefd290628e0291ea415ca44cad703d2909" Oct 01 12:48:09 crc kubenswrapper[4727]: I1001 12:48:09.170925 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.765603 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-78766f874-pgvrs"] Oct 01 12:48:17 crc kubenswrapper[4727]: E1001 12:48:17.767683 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" containerName="extract" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.767781 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" containerName="extract" Oct 01 12:48:17 crc kubenswrapper[4727]: E1001 12:48:17.767875 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" containerName="pull" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.767947 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" containerName="pull" Oct 01 12:48:17 crc kubenswrapper[4727]: E1001 12:48:17.768065 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="366b7e92-ea45-4052-8ddc-9540d534a7ad" containerName="console" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.768243 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="366b7e92-ea45-4052-8ddc-9540d534a7ad" containerName="console" Oct 01 12:48:17 crc kubenswrapper[4727]: E1001 12:48:17.768308 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" containerName="util" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.768372 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" containerName="util" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.768569 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e0c2a76-78ea-4d0c-bfd4-541f729a20a0" containerName="extract" Oct 01 
12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.768655 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="366b7e92-ea45-4052-8ddc-9540d534a7ad" containerName="console" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.769249 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:17 crc kubenswrapper[4727]: W1001 12:48:17.771192 4727 reflector.go:561] object-"metallb-system"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 01 12:48:17 crc kubenswrapper[4727]: E1001 12:48:17.771247 4727 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 12:48:17 crc kubenswrapper[4727]: W1001 12:48:17.771312 4727 reflector.go:561] object-"metallb-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 01 12:48:17 crc kubenswrapper[4727]: E1001 12:48:17.771347 4727 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 12:48:17 crc kubenswrapper[4727]: W1001 12:48:17.771513 4727 reflector.go:561] object-"metallb-system"/"metallb-operator-webhook-server-cert": failed to list *v1.Secret: secrets "metallb-operator-webhook-server-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 01 12:48:17 crc kubenswrapper[4727]: E1001 12:48:17.771556 4727 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-webhook-server-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-webhook-server-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 12:48:17 crc kubenswrapper[4727]: W1001 12:48:17.771618 4727 reflector.go:561] object-"metallb-system"/"metallb-operator-controller-manager-service-cert": failed to list *v1.Secret: secrets "metallb-operator-controller-manager-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 01 12:48:17 crc kubenswrapper[4727]: E1001 12:48:17.771635 4727 reflector.go:158] "Unhandled Error" 
err="object-\"metallb-system\"/\"metallb-operator-controller-manager-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-controller-manager-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 12:48:17 crc kubenswrapper[4727]: W1001 12:48:17.771712 4727 reflector.go:561] object-"metallb-system"/"manager-account-dockercfg-nrhvr": failed to list *v1.Secret: secrets "manager-account-dockercfg-nrhvr" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Oct 01 12:48:17 crc kubenswrapper[4727]: E1001 12:48:17.771735 4727 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"manager-account-dockercfg-nrhvr\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"manager-account-dockercfg-nrhvr\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.790244 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78766f874-pgvrs"] Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.821633 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/158e3a7a-113f-4004-9c41-9676efc0e93a-apiservice-cert\") pod \"metallb-operator-controller-manager-78766f874-pgvrs\" (UID: \"158e3a7a-113f-4004-9c41-9676efc0e93a\") " pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.821695 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/158e3a7a-113f-4004-9c41-9676efc0e93a-webhook-cert\") pod \"metallb-operator-controller-manager-78766f874-pgvrs\" (UID: \"158e3a7a-113f-4004-9c41-9676efc0e93a\") " pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.821735 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jnt7\" (UniqueName: \"kubernetes.io/projected/158e3a7a-113f-4004-9c41-9676efc0e93a-kube-api-access-5jnt7\") pod \"metallb-operator-controller-manager-78766f874-pgvrs\" (UID: \"158e3a7a-113f-4004-9c41-9676efc0e93a\") " pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.923570 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/158e3a7a-113f-4004-9c41-9676efc0e93a-webhook-cert\") pod \"metallb-operator-controller-manager-78766f874-pgvrs\" (UID: \"158e3a7a-113f-4004-9c41-9676efc0e93a\") " pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.923639 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jnt7\" (UniqueName: 
\"kubernetes.io/projected/158e3a7a-113f-4004-9c41-9676efc0e93a-kube-api-access-5jnt7\") pod \"metallb-operator-controller-manager-78766f874-pgvrs\" (UID: \"158e3a7a-113f-4004-9c41-9676efc0e93a\") " pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:17 crc kubenswrapper[4727]: I1001 12:48:17.923738 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/158e3a7a-113f-4004-9c41-9676efc0e93a-apiservice-cert\") pod \"metallb-operator-controller-manager-78766f874-pgvrs\" (UID: \"158e3a7a-113f-4004-9c41-9676efc0e93a\") " pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.004267 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c"] Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.004969 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.006601 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.007164 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.007250 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qg9sw" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.015740 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c"] Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.126207 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4871f70f-30b9-4353-a083-6c6913107fa1-webhook-cert\") pod \"metallb-operator-webhook-server-5c98f7df7d-57v2c\" (UID: \"4871f70f-30b9-4353-a083-6c6913107fa1\") " pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.126269 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4871f70f-30b9-4353-a083-6c6913107fa1-apiservice-cert\") pod \"metallb-operator-webhook-server-5c98f7df7d-57v2c\" (UID: \"4871f70f-30b9-4353-a083-6c6913107fa1\") " pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.126461 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4grg\" (UniqueName: \"kubernetes.io/projected/4871f70f-30b9-4353-a083-6c6913107fa1-kube-api-access-k4grg\") pod \"metallb-operator-webhook-server-5c98f7df7d-57v2c\" (UID: \"4871f70f-30b9-4353-a083-6c6913107fa1\") " pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.227663 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4grg\" (UniqueName: \"kubernetes.io/projected/4871f70f-30b9-4353-a083-6c6913107fa1-kube-api-access-k4grg\") pod \"metallb-operator-webhook-server-5c98f7df7d-57v2c\" (UID: 
\"4871f70f-30b9-4353-a083-6c6913107fa1\") " pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.227804 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4871f70f-30b9-4353-a083-6c6913107fa1-webhook-cert\") pod \"metallb-operator-webhook-server-5c98f7df7d-57v2c\" (UID: \"4871f70f-30b9-4353-a083-6c6913107fa1\") " pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.227842 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4871f70f-30b9-4353-a083-6c6913107fa1-apiservice-cert\") pod \"metallb-operator-webhook-server-5c98f7df7d-57v2c\" (UID: \"4871f70f-30b9-4353-a083-6c6913107fa1\") " pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.234887 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4871f70f-30b9-4353-a083-6c6913107fa1-webhook-cert\") pod \"metallb-operator-webhook-server-5c98f7df7d-57v2c\" (UID: \"4871f70f-30b9-4353-a083-6c6913107fa1\") " pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.236933 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4871f70f-30b9-4353-a083-6c6913107fa1-apiservice-cert\") pod \"metallb-operator-webhook-server-5c98f7df7d-57v2c\" (UID: \"4871f70f-30b9-4353-a083-6c6913107fa1\") " pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.613480 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.619645 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/158e3a7a-113f-4004-9c41-9676efc0e93a-webhook-cert\") pod \"metallb-operator-controller-manager-78766f874-pgvrs\" (UID: \"158e3a7a-113f-4004-9c41-9676efc0e93a\") " pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.636432 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/158e3a7a-113f-4004-9c41-9676efc0e93a-apiservice-cert\") pod \"metallb-operator-controller-manager-78766f874-pgvrs\" (UID: \"158e3a7a-113f-4004-9c41-9676efc0e93a\") " pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.744518 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.895336 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.898880 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.900815 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5jnt7\" (UniqueName: \"kubernetes.io/projected/158e3a7a-113f-4004-9c41-9676efc0e93a-kube-api-access-5jnt7\") pod \"metallb-operator-controller-manager-78766f874-pgvrs\" (UID: \"158e3a7a-113f-4004-9c41-9676efc0e93a\") " pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.900870 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4grg\" (UniqueName: \"kubernetes.io/projected/4871f70f-30b9-4353-a083-6c6913107fa1-kube-api-access-k4grg\") pod \"metallb-operator-webhook-server-5c98f7df7d-57v2c\" (UID: \"4871f70f-30b9-4353-a083-6c6913107fa1\") " pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.914228 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-nrhvr" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.920388 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:18 crc kubenswrapper[4727]: I1001 12:48:18.986508 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:19 crc kubenswrapper[4727]: I1001 12:48:19.149020 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c"] Oct 01 12:48:19 crc kubenswrapper[4727]: I1001 12:48:19.225445 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" event={"ID":"4871f70f-30b9-4353-a083-6c6913107fa1","Type":"ContainerStarted","Data":"cddcdd4d6ca7a7dd3d4eda29c16ff47ec2d3fe6cf387c583db5a7d1f3903ec5c"} Oct 01 12:48:19 crc kubenswrapper[4727]: I1001 12:48:19.226540 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78766f874-pgvrs"] Oct 01 12:48:19 crc kubenswrapper[4727]: W1001 12:48:19.228613 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158e3a7a_113f_4004_9c41_9676efc0e93a.slice/crio-76317412f29e586279b99d692189956e7de1df1bdef62b3e4b79608c7ea997ea WatchSource:0}: Error finding container 76317412f29e586279b99d692189956e7de1df1bdef62b3e4b79608c7ea997ea: Status 404 returned error can't find the container with id 76317412f29e586279b99d692189956e7de1df1bdef62b3e4b79608c7ea997ea Oct 01 12:48:20 crc kubenswrapper[4727]: I1001 12:48:20.239424 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" event={"ID":"158e3a7a-113f-4004-9c41-9676efc0e93a","Type":"ContainerStarted","Data":"76317412f29e586279b99d692189956e7de1df1bdef62b3e4b79608c7ea997ea"} Oct 01 12:48:26 crc kubenswrapper[4727]: I1001 12:48:26.271390 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" event={"ID":"4871f70f-30b9-4353-a083-6c6913107fa1","Type":"ContainerStarted","Data":"3b547947baeb2e435211d97b8e66ff9490b35d76ea15bd487e5e7b23197af4a0"} Oct 01 12:48:26 crc kubenswrapper[4727]: I1001 12:48:26.272179 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:26 crc kubenswrapper[4727]: I1001 
12:48:26.273070 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" event={"ID":"158e3a7a-113f-4004-9c41-9676efc0e93a","Type":"ContainerStarted","Data":"b77c351589923254eb3335f0c75eede2022ce0cfe66ab44fbb961998d89f0301"} Oct 01 12:48:26 crc kubenswrapper[4727]: I1001 12:48:26.273251 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:26 crc kubenswrapper[4727]: I1001 12:48:26.293482 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" podStartSLOduration=3.069573485 podStartE2EDuration="9.293464693s" podCreationTimestamp="2025-10-01 12:48:17 +0000 UTC" firstStartedPulling="2025-10-01 12:48:19.163579494 +0000 UTC m=+677.484934331" lastFinishedPulling="2025-10-01 12:48:25.387470702 +0000 UTC m=+683.708825539" observedRunningTime="2025-10-01 12:48:26.290979997 +0000 UTC m=+684.612334854" watchObservedRunningTime="2025-10-01 12:48:26.293464693 +0000 UTC m=+684.614819530" Oct 01 12:48:26 crc kubenswrapper[4727]: I1001 12:48:26.314197 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" podStartSLOduration=3.179569924 podStartE2EDuration="9.314178147s" podCreationTimestamp="2025-10-01 12:48:17 +0000 UTC" firstStartedPulling="2025-10-01 12:48:19.231884936 +0000 UTC m=+677.553239773" lastFinishedPulling="2025-10-01 12:48:25.366493159 +0000 UTC m=+683.687847996" observedRunningTime="2025-10-01 12:48:26.309409482 +0000 UTC m=+684.630764339" watchObservedRunningTime="2025-10-01 12:48:26.314178147 +0000 UTC m=+684.635532994" Oct 01 12:48:33 crc kubenswrapper[4727]: I1001 12:48:33.292432 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:48:33 crc kubenswrapper[4727]: I1001 12:48:33.293067 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:48:38 crc kubenswrapper[4727]: I1001 12:48:38.925680 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5c98f7df7d-57v2c" Oct 01 12:48:58 crc kubenswrapper[4727]: I1001 12:48:58.989158 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-78766f874-pgvrs" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.873843 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc"] Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.875863 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.878594 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-fjc6d" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.879170 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.879259 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-rtkfs"] Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.882653 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.884289 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.884879 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc"] Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.888524 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.962121 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-p277d"] Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.963031 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-p277d" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.965849 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.967349 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.967400 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-w8z4l" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.967902 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.981831 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84740a24-d4ce-4fea-83c6-c79d664ee07b-cert\") pod \"frr-k8s-webhook-server-5478bdb765-4nbzc\" (UID: \"84740a24-d4ce-4fea-83c6-c79d664ee07b\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.981898 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdfl4\" (UniqueName: \"kubernetes.io/projected/84740a24-d4ce-4fea-83c6-c79d664ee07b-kube-api-access-kdfl4\") pod \"frr-k8s-webhook-server-5478bdb765-4nbzc\" (UID: \"84740a24-d4ce-4fea-83c6-c79d664ee07b\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" Oct 01 12:48:59 crc kubenswrapper[4727]: I1001 12:48:59.987979 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-w2n4b"] Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.002481 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.010015 4727 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.031572 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-w2n4b"] Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.085671 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-metrics-certs\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.085738 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/63cffc61-9887-47ef-85a1-7c6705e44845-metallb-excludel2\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.085789 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5361bcf2-5da0-41fa-8c27-3507e59217f9-frr-conf\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.085811 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5361bcf2-5da0-41fa-8c27-3507e59217f9-frr-sockets\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.085829 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-memberlist\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.085860 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84740a24-d4ce-4fea-83c6-c79d664ee07b-cert\") pod \"frr-k8s-webhook-server-5478bdb765-4nbzc\" (UID: \"84740a24-d4ce-4fea-83c6-c79d664ee07b\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.085879 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5361bcf2-5da0-41fa-8c27-3507e59217f9-frr-startup\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.085900 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdfl4\" (UniqueName: \"kubernetes.io/projected/84740a24-d4ce-4fea-83c6-c79d664ee07b-kube-api-access-kdfl4\") pod \"frr-k8s-webhook-server-5478bdb765-4nbzc\" (UID: \"84740a24-d4ce-4fea-83c6-c79d664ee07b\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 
12:49:00.085918 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5361bcf2-5da0-41fa-8c27-3507e59217f9-metrics-certs\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.085943 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5361bcf2-5da0-41fa-8c27-3507e59217f9-metrics\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.085962 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs2wz\" (UniqueName: \"kubernetes.io/projected/5361bcf2-5da0-41fa-8c27-3507e59217f9-kube-api-access-rs2wz\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.085985 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxzk7\" (UniqueName: \"kubernetes.io/projected/63cffc61-9887-47ef-85a1-7c6705e44845-kube-api-access-zxzk7\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.086023 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5361bcf2-5da0-41fa-8c27-3507e59217f9-reloader\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.093894 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84740a24-d4ce-4fea-83c6-c79d664ee07b-cert\") pod \"frr-k8s-webhook-server-5478bdb765-4nbzc\" (UID: \"84740a24-d4ce-4fea-83c6-c79d664ee07b\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.145891 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdfl4\" (UniqueName: \"kubernetes.io/projected/84740a24-d4ce-4fea-83c6-c79d664ee07b-kube-api-access-kdfl4\") pod \"frr-k8s-webhook-server-5478bdb765-4nbzc\" (UID: \"84740a24-d4ce-4fea-83c6-c79d664ee07b\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187198 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5361bcf2-5da0-41fa-8c27-3507e59217f9-frr-conf\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187266 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3c3856-d38f-49ec-8930-10cd2f8f2b61-metrics-certs\") pod \"controller-5d688f5ffc-w2n4b\" (UID: \"4f3c3856-d38f-49ec-8930-10cd2f8f2b61\") " pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187290 4727 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5361bcf2-5da0-41fa-8c27-3507e59217f9-frr-sockets\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187310 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-memberlist\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187336 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5361bcf2-5da0-41fa-8c27-3507e59217f9-frr-startup\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187352 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5361bcf2-5da0-41fa-8c27-3507e59217f9-metrics-certs\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187370 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f3c3856-d38f-49ec-8930-10cd2f8f2b61-cert\") pod \"controller-5d688f5ffc-w2n4b\" (UID: \"4f3c3856-d38f-49ec-8930-10cd2f8f2b61\") " pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187389 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5361bcf2-5da0-41fa-8c27-3507e59217f9-metrics\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187404 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs2wz\" (UniqueName: \"kubernetes.io/projected/5361bcf2-5da0-41fa-8c27-3507e59217f9-kube-api-access-rs2wz\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187421 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25hsb\" (UniqueName: \"kubernetes.io/projected/4f3c3856-d38f-49ec-8930-10cd2f8f2b61-kube-api-access-25hsb\") pod \"controller-5d688f5ffc-w2n4b\" (UID: \"4f3c3856-d38f-49ec-8930-10cd2f8f2b61\") " pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187443 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxzk7\" (UniqueName: \"kubernetes.io/projected/63cffc61-9887-47ef-85a1-7c6705e44845-kube-api-access-zxzk7\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187465 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5361bcf2-5da0-41fa-8c27-3507e59217f9-reloader\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " 
pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187497 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-metrics-certs\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.187523 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/63cffc61-9887-47ef-85a1-7c6705e44845-metallb-excludel2\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.188301 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/63cffc61-9887-47ef-85a1-7c6705e44845-metallb-excludel2\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.188640 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5361bcf2-5da0-41fa-8c27-3507e59217f9-frr-conf\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.188879 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5361bcf2-5da0-41fa-8c27-3507e59217f9-frr-sockets\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: E1001 12:49:00.188953 4727 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 12:49:00 crc kubenswrapper[4727]: E1001 12:49:00.189016 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-memberlist podName:63cffc61-9887-47ef-85a1-7c6705e44845 nodeName:}" failed. No retries permitted until 2025-10-01 12:49:00.688984614 +0000 UTC m=+719.010339451 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-memberlist") pod "speaker-p277d" (UID: "63cffc61-9887-47ef-85a1-7c6705e44845") : secret "metallb-memberlist" not found Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.189579 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5361bcf2-5da0-41fa-8c27-3507e59217f9-reloader\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: E1001 12:49:00.189659 4727 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 01 12:49:00 crc kubenswrapper[4727]: E1001 12:49:00.189693 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-metrics-certs podName:63cffc61-9887-47ef-85a1-7c6705e44845 nodeName:}" failed. No retries permitted until 2025-10-01 12:49:00.689682486 +0000 UTC m=+719.011037323 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-metrics-certs") pod "speaker-p277d" (UID: "63cffc61-9887-47ef-85a1-7c6705e44845") : secret "speaker-certs-secret" not found Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.190451 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5361bcf2-5da0-41fa-8c27-3507e59217f9-frr-startup\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.190704 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5361bcf2-5da0-41fa-8c27-3507e59217f9-metrics\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.193274 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5361bcf2-5da0-41fa-8c27-3507e59217f9-metrics-certs\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.210432 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.217673 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxzk7\" (UniqueName: \"kubernetes.io/projected/63cffc61-9887-47ef-85a1-7c6705e44845-kube-api-access-zxzk7\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.226634 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs2wz\" (UniqueName: \"kubernetes.io/projected/5361bcf2-5da0-41fa-8c27-3507e59217f9-kube-api-access-rs2wz\") pod \"frr-k8s-rtkfs\" (UID: \"5361bcf2-5da0-41fa-8c27-3507e59217f9\") " pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.288621 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3c3856-d38f-49ec-8930-10cd2f8f2b61-metrics-certs\") pod \"controller-5d688f5ffc-w2n4b\" (UID: \"4f3c3856-d38f-49ec-8930-10cd2f8f2b61\") " pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.288695 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f3c3856-d38f-49ec-8930-10cd2f8f2b61-cert\") pod \"controller-5d688f5ffc-w2n4b\" (UID: \"4f3c3856-d38f-49ec-8930-10cd2f8f2b61\") " pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.288728 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25hsb\" (UniqueName: \"kubernetes.io/projected/4f3c3856-d38f-49ec-8930-10cd2f8f2b61-kube-api-access-25hsb\") pod \"controller-5d688f5ffc-w2n4b\" (UID: \"4f3c3856-d38f-49ec-8930-10cd2f8f2b61\") " pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.292231 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/4f3c3856-d38f-49ec-8930-10cd2f8f2b61-cert\") pod \"controller-5d688f5ffc-w2n4b\" (UID: \"4f3c3856-d38f-49ec-8930-10cd2f8f2b61\") " pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.292476 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f3c3856-d38f-49ec-8930-10cd2f8f2b61-metrics-certs\") pod \"controller-5d688f5ffc-w2n4b\" (UID: \"4f3c3856-d38f-49ec-8930-10cd2f8f2b61\") " pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.319047 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25hsb\" (UniqueName: \"kubernetes.io/projected/4f3c3856-d38f-49ec-8930-10cd2f8f2b61-kube-api-access-25hsb\") pod \"controller-5d688f5ffc-w2n4b\" (UID: \"4f3c3856-d38f-49ec-8930-10cd2f8f2b61\") " pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.332314 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.480235 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc"] Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.519526 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.694777 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-metrics-certs\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.694888 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-memberlist\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: E1001 12:49:00.695111 4727 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 12:49:00 crc kubenswrapper[4727]: E1001 12:49:00.695202 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-memberlist podName:63cffc61-9887-47ef-85a1-7c6705e44845 nodeName:}" failed. No retries permitted until 2025-10-01 12:49:01.695177401 +0000 UTC m=+720.016532238 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-memberlist") pod "speaker-p277d" (UID: "63cffc61-9887-47ef-85a1-7c6705e44845") : secret "metallb-memberlist" not found Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.700635 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-metrics-certs\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:00 crc kubenswrapper[4727]: I1001 12:49:00.749500 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-w2n4b"] Oct 01 12:49:00 crc kubenswrapper[4727]: W1001 12:49:00.760848 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f3c3856_d38f_49ec_8930_10cd2f8f2b61.slice/crio-6a296c78e4597f40e8f8d4fb41b9ce2928ad6c2887de7226a7fd276b503573e7 WatchSource:0}: Error finding container 6a296c78e4597f40e8f8d4fb41b9ce2928ad6c2887de7226a7fd276b503573e7: Status 404 returned error can't find the container with id 6a296c78e4597f40e8f8d4fb41b9ce2928ad6c2887de7226a7fd276b503573e7 Oct 01 12:49:01 crc kubenswrapper[4727]: I1001 12:49:01.477478 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" event={"ID":"84740a24-d4ce-4fea-83c6-c79d664ee07b","Type":"ContainerStarted","Data":"d3cfd05562d74c196d6e1718c2fcd95b38d03564f74df7c2d2c8c49720d922c6"} Oct 01 12:49:01 crc kubenswrapper[4727]: I1001 12:49:01.478782 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rtkfs" event={"ID":"5361bcf2-5da0-41fa-8c27-3507e59217f9","Type":"ContainerStarted","Data":"964dec64d8bb9023109cbd54e043432b5dc7c2ee630d70282e0d1b90bcaec28d"} Oct 01 12:49:01 crc kubenswrapper[4727]: I1001 12:49:01.480933 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-w2n4b" event={"ID":"4f3c3856-d38f-49ec-8930-10cd2f8f2b61","Type":"ContainerStarted","Data":"1fce55978d719ba90da5df6128f5bd13d6a2d91d3cf0b70c6585407634729714"} Oct 01 12:49:01 crc kubenswrapper[4727]: I1001 12:49:01.480971 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-w2n4b" event={"ID":"4f3c3856-d38f-49ec-8930-10cd2f8f2b61","Type":"ContainerStarted","Data":"6a296c78e4597f40e8f8d4fb41b9ce2928ad6c2887de7226a7fd276b503573e7"} Oct 01 12:49:01 crc kubenswrapper[4727]: I1001 12:49:01.708017 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-memberlist\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:01 crc kubenswrapper[4727]: I1001 12:49:01.721636 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/63cffc61-9887-47ef-85a1-7c6705e44845-memberlist\") pod \"speaker-p277d\" (UID: \"63cffc61-9887-47ef-85a1-7c6705e44845\") " pod="metallb-system/speaker-p277d" Oct 01 12:49:01 crc kubenswrapper[4727]: I1001 12:49:01.780706 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-p277d" Oct 01 12:49:01 crc kubenswrapper[4727]: W1001 12:49:01.810852 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63cffc61_9887_47ef_85a1_7c6705e44845.slice/crio-c8b5c93319c20efbd7047af0d0b95aa1a646aeead4c1446a2075ccc4c8053c49 WatchSource:0}: Error finding container c8b5c93319c20efbd7047af0d0b95aa1a646aeead4c1446a2075ccc4c8053c49: Status 404 returned error can't find the container with id c8b5c93319c20efbd7047af0d0b95aa1a646aeead4c1446a2075ccc4c8053c49 Oct 01 12:49:02 crc kubenswrapper[4727]: I1001 12:49:02.500099 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-p277d" event={"ID":"63cffc61-9887-47ef-85a1-7c6705e44845","Type":"ContainerStarted","Data":"90ef9baf2ede525b4b3fb4527d52bf18b4390244531ec34e6e459a56de0ce27b"} Oct 01 12:49:02 crc kubenswrapper[4727]: I1001 12:49:02.500794 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-p277d" event={"ID":"63cffc61-9887-47ef-85a1-7c6705e44845","Type":"ContainerStarted","Data":"c8b5c93319c20efbd7047af0d0b95aa1a646aeead4c1446a2075ccc4c8053c49"} Oct 01 12:49:02 crc kubenswrapper[4727]: I1001 12:49:02.505016 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-w2n4b" event={"ID":"4f3c3856-d38f-49ec-8930-10cd2f8f2b61","Type":"ContainerStarted","Data":"32c6882a276a4b6cde4a43189ec018fdf638cf8a051142f0dd1a6b0ea9eb15ed"} Oct 01 12:49:02 crc kubenswrapper[4727]: I1001 12:49:02.505358 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:02 crc kubenswrapper[4727]: I1001 12:49:02.539523 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-w2n4b" podStartSLOduration=3.539504967 podStartE2EDuration="3.539504967s" podCreationTimestamp="2025-10-01 12:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:49:02.534476407 +0000 UTC m=+720.855831244" watchObservedRunningTime="2025-10-01 12:49:02.539504967 +0000 UTC m=+720.860859804" Oct 01 12:49:03 crc kubenswrapper[4727]: I1001 12:49:03.295042 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:49:03 crc kubenswrapper[4727]: I1001 12:49:03.295598 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:49:03 crc kubenswrapper[4727]: I1001 12:49:03.515395 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-p277d" event={"ID":"63cffc61-9887-47ef-85a1-7c6705e44845","Type":"ContainerStarted","Data":"e79fc294283523d97b9c781ff058ca6bee8c2f24abe966650cf5973033453d82"} Oct 01 12:49:03 crc kubenswrapper[4727]: I1001 12:49:03.516191 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-p277d" Oct 01 12:49:03 crc kubenswrapper[4727]: I1001 
12:49:03.540412 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-p277d" podStartSLOduration=4.540395754 podStartE2EDuration="4.540395754s" podCreationTimestamp="2025-10-01 12:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:49:03.537501032 +0000 UTC m=+721.858855869" watchObservedRunningTime="2025-10-01 12:49:03.540395754 +0000 UTC m=+721.861750591" Oct 01 12:49:12 crc kubenswrapper[4727]: I1001 12:49:12.584825 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" event={"ID":"84740a24-d4ce-4fea-83c6-c79d664ee07b","Type":"ContainerStarted","Data":"8df0dff0c405b5964319b9ab745112a3cd1f0d0a3d00c1f43be2cc1870aa0e6b"} Oct 01 12:49:12 crc kubenswrapper[4727]: I1001 12:49:12.585562 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" Oct 01 12:49:12 crc kubenswrapper[4727]: I1001 12:49:12.587926 4727 generic.go:334] "Generic (PLEG): container finished" podID="5361bcf2-5da0-41fa-8c27-3507e59217f9" containerID="7c3a76a5b7f6c50919c223dee1e65d4ca35592fd0dbf6766b04bb272fdf0a89a" exitCode=0 Oct 01 12:49:12 crc kubenswrapper[4727]: I1001 12:49:12.588022 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rtkfs" event={"ID":"5361bcf2-5da0-41fa-8c27-3507e59217f9","Type":"ContainerDied","Data":"7c3a76a5b7f6c50919c223dee1e65d4ca35592fd0dbf6766b04bb272fdf0a89a"} Oct 01 12:49:12 crc kubenswrapper[4727]: I1001 12:49:12.606272 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" podStartSLOduration=2.6005616959999998 podStartE2EDuration="13.606245821s" podCreationTimestamp="2025-10-01 12:48:59 +0000 UTC" firstStartedPulling="2025-10-01 12:49:00.500276775 +0000 UTC m=+718.821631612" lastFinishedPulling="2025-10-01 12:49:11.50596089 +0000 UTC m=+729.827315737" observedRunningTime="2025-10-01 12:49:12.602509183 +0000 UTC m=+730.923864040" watchObservedRunningTime="2025-10-01 12:49:12.606245821 +0000 UTC m=+730.927600658" Oct 01 12:49:13 crc kubenswrapper[4727]: I1001 12:49:13.594190 4727 generic.go:334] "Generic (PLEG): container finished" podID="5361bcf2-5da0-41fa-8c27-3507e59217f9" containerID="f7f39634d97faaae0a8f178a91e3b5a4bf3825735533dcd3542d1875f6b30875" exitCode=0 Oct 01 12:49:13 crc kubenswrapper[4727]: I1001 12:49:13.594259 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rtkfs" event={"ID":"5361bcf2-5da0-41fa-8c27-3507e59217f9","Type":"ContainerDied","Data":"f7f39634d97faaae0a8f178a91e3b5a4bf3825735533dcd3542d1875f6b30875"} Oct 01 12:49:14 crc kubenswrapper[4727]: I1001 12:49:14.600946 4727 generic.go:334] "Generic (PLEG): container finished" podID="5361bcf2-5da0-41fa-8c27-3507e59217f9" containerID="9d316e7a0fe92be67df4ede7da3d54e845d61e7193788414cfb68951b1944d0b" exitCode=0 Oct 01 12:49:14 crc kubenswrapper[4727]: I1001 12:49:14.600989 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rtkfs" event={"ID":"5361bcf2-5da0-41fa-8c27-3507e59217f9","Type":"ContainerDied","Data":"9d316e7a0fe92be67df4ede7da3d54e845d61e7193788414cfb68951b1944d0b"} Oct 01 12:49:15 crc kubenswrapper[4727]: I1001 12:49:15.612264 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rtkfs" 
event={"ID":"5361bcf2-5da0-41fa-8c27-3507e59217f9","Type":"ContainerStarted","Data":"845d61cb1c7bda042f19a0bb705ca1ca58541b89059d21941f07852429efce9d"} Oct 01 12:49:15 crc kubenswrapper[4727]: I1001 12:49:15.612664 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rtkfs" event={"ID":"5361bcf2-5da0-41fa-8c27-3507e59217f9","Type":"ContainerStarted","Data":"ff3e70aa1850a934adfff1ae4099e26afa5919d847f62f7ecda749ba1bef927c"} Oct 01 12:49:15 crc kubenswrapper[4727]: I1001 12:49:15.612687 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rtkfs" event={"ID":"5361bcf2-5da0-41fa-8c27-3507e59217f9","Type":"ContainerStarted","Data":"6cc5330c549fdc8956b9c4eb341e3602752cd6d0dac71fd11708038bd87c77ac"} Oct 01 12:49:16 crc kubenswrapper[4727]: I1001 12:49:16.624802 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rtkfs" event={"ID":"5361bcf2-5da0-41fa-8c27-3507e59217f9","Type":"ContainerStarted","Data":"17596f8c0b8cd8296b7d487f9619a4e08e919323cb91962c653260d1b85ea9d1"} Oct 01 12:49:16 crc kubenswrapper[4727]: I1001 12:49:16.624845 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rtkfs" event={"ID":"5361bcf2-5da0-41fa-8c27-3507e59217f9","Type":"ContainerStarted","Data":"221ed20a3c4e0e85462fce5bc9f93c3ff50940121819e688e57ac8e8d2e9fae4"} Oct 01 12:49:18 crc kubenswrapper[4727]: I1001 12:49:18.644539 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rtkfs" event={"ID":"5361bcf2-5da0-41fa-8c27-3507e59217f9","Type":"ContainerStarted","Data":"c84ae574725e17028341b8ce249253611fc8f0f339128ddf5a641bb7d92d12b2"} Oct 01 12:49:18 crc kubenswrapper[4727]: I1001 12:49:18.644963 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:18 crc kubenswrapper[4727]: I1001 12:49:18.668269 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-rtkfs" podStartSLOduration=8.975100824 podStartE2EDuration="19.66824354s" podCreationTimestamp="2025-10-01 12:48:59 +0000 UTC" firstStartedPulling="2025-10-01 12:49:00.857833703 +0000 UTC m=+719.179188540" lastFinishedPulling="2025-10-01 12:49:11.550976419 +0000 UTC m=+729.872331256" observedRunningTime="2025-10-01 12:49:18.66730439 +0000 UTC m=+736.988659227" watchObservedRunningTime="2025-10-01 12:49:18.66824354 +0000 UTC m=+736.989598377" Oct 01 12:49:20 crc kubenswrapper[4727]: I1001 12:49:20.336502 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-w2n4b" Oct 01 12:49:20 crc kubenswrapper[4727]: I1001 12:49:20.520686 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:20 crc kubenswrapper[4727]: I1001 12:49:20.578645 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:21 crc kubenswrapper[4727]: I1001 12:49:21.785262 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-p277d" Oct 01 12:49:24 crc kubenswrapper[4727]: I1001 12:49:24.869072 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-b9qrr"] Oct 01 12:49:24 crc kubenswrapper[4727]: I1001 12:49:24.870395 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-b9qrr" Oct 01 12:49:24 crc kubenswrapper[4727]: I1001 12:49:24.879770 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 01 12:49:24 crc kubenswrapper[4727]: I1001 12:49:24.879784 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-j4gp6" Oct 01 12:49:24 crc kubenswrapper[4727]: I1001 12:49:24.884216 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 01 12:49:24 crc kubenswrapper[4727]: I1001 12:49:24.895325 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-b9qrr"] Oct 01 12:49:25 crc kubenswrapper[4727]: I1001 12:49:25.029509 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5ml5\" (UniqueName: \"kubernetes.io/projected/095d3a10-d7e4-4e02-907d-359eb1603abe-kube-api-access-z5ml5\") pod \"openstack-operator-index-b9qrr\" (UID: \"095d3a10-d7e4-4e02-907d-359eb1603abe\") " pod="openstack-operators/openstack-operator-index-b9qrr" Oct 01 12:49:25 crc kubenswrapper[4727]: I1001 12:49:25.131421 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5ml5\" (UniqueName: \"kubernetes.io/projected/095d3a10-d7e4-4e02-907d-359eb1603abe-kube-api-access-z5ml5\") pod \"openstack-operator-index-b9qrr\" (UID: \"095d3a10-d7e4-4e02-907d-359eb1603abe\") " pod="openstack-operators/openstack-operator-index-b9qrr" Oct 01 12:49:25 crc kubenswrapper[4727]: I1001 12:49:25.155127 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5ml5\" (UniqueName: \"kubernetes.io/projected/095d3a10-d7e4-4e02-907d-359eb1603abe-kube-api-access-z5ml5\") pod \"openstack-operator-index-b9qrr\" (UID: \"095d3a10-d7e4-4e02-907d-359eb1603abe\") " pod="openstack-operators/openstack-operator-index-b9qrr" Oct 01 12:49:25 crc kubenswrapper[4727]: I1001 12:49:25.192214 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-b9qrr" Oct 01 12:49:25 crc kubenswrapper[4727]: I1001 12:49:25.610993 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-b9qrr"] Oct 01 12:49:25 crc kubenswrapper[4727]: I1001 12:49:25.688444 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b9qrr" event={"ID":"095d3a10-d7e4-4e02-907d-359eb1603abe","Type":"ContainerStarted","Data":"eee66639274b30b16bc328394fa30a2aaafd542abeb17a3425671bed42024737"} Oct 01 12:49:28 crc kubenswrapper[4727]: I1001 12:49:28.227047 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-b9qrr"] Oct 01 12:49:28 crc kubenswrapper[4727]: I1001 12:49:28.836278 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nlvhp"] Oct 01 12:49:28 crc kubenswrapper[4727]: I1001 12:49:28.837253 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nlvhp" Oct 01 12:49:28 crc kubenswrapper[4727]: I1001 12:49:28.844843 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nlvhp"] Oct 01 12:49:28 crc kubenswrapper[4727]: I1001 12:49:28.883059 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwnlj\" (UniqueName: \"kubernetes.io/projected/ea50a03d-3d5c-4e61-9703-6a8980e33a1f-kube-api-access-jwnlj\") pod \"openstack-operator-index-nlvhp\" (UID: \"ea50a03d-3d5c-4e61-9703-6a8980e33a1f\") " pod="openstack-operators/openstack-operator-index-nlvhp" Oct 01 12:49:28 crc kubenswrapper[4727]: I1001 12:49:28.984046 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwnlj\" (UniqueName: \"kubernetes.io/projected/ea50a03d-3d5c-4e61-9703-6a8980e33a1f-kube-api-access-jwnlj\") pod \"openstack-operator-index-nlvhp\" (UID: \"ea50a03d-3d5c-4e61-9703-6a8980e33a1f\") " pod="openstack-operators/openstack-operator-index-nlvhp" Oct 01 12:49:29 crc kubenswrapper[4727]: I1001 12:49:29.003217 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwnlj\" (UniqueName: \"kubernetes.io/projected/ea50a03d-3d5c-4e61-9703-6a8980e33a1f-kube-api-access-jwnlj\") pod \"openstack-operator-index-nlvhp\" (UID: \"ea50a03d-3d5c-4e61-9703-6a8980e33a1f\") " pod="openstack-operators/openstack-operator-index-nlvhp" Oct 01 12:49:29 crc kubenswrapper[4727]: I1001 12:49:29.158473 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nlvhp" Oct 01 12:49:29 crc kubenswrapper[4727]: I1001 12:49:29.585382 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nlvhp"] Oct 01 12:49:29 crc kubenswrapper[4727]: W1001 12:49:29.597245 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea50a03d_3d5c_4e61_9703_6a8980e33a1f.slice/crio-c230705b01206a87e16cb76ee54f6b6e6be7da3d9a7d91e91b7444128b5d9a53 WatchSource:0}: Error finding container c230705b01206a87e16cb76ee54f6b6e6be7da3d9a7d91e91b7444128b5d9a53: Status 404 returned error can't find the container with id c230705b01206a87e16cb76ee54f6b6e6be7da3d9a7d91e91b7444128b5d9a53 Oct 01 12:49:29 crc kubenswrapper[4727]: I1001 12:49:29.723824 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nlvhp" event={"ID":"ea50a03d-3d5c-4e61-9703-6a8980e33a1f","Type":"ContainerStarted","Data":"c230705b01206a87e16cb76ee54f6b6e6be7da3d9a7d91e91b7444128b5d9a53"} Oct 01 12:49:29 crc kubenswrapper[4727]: I1001 12:49:29.726387 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b9qrr" event={"ID":"095d3a10-d7e4-4e02-907d-359eb1603abe","Type":"ContainerStarted","Data":"87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09"} Oct 01 12:49:29 crc kubenswrapper[4727]: I1001 12:49:29.726484 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-b9qrr" podUID="095d3a10-d7e4-4e02-907d-359eb1603abe" containerName="registry-server" containerID="cri-o://87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09" gracePeriod=2 Oct 01 12:49:29 crc kubenswrapper[4727]: I1001 12:49:29.744482 4727 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-b9qrr" podStartSLOduration=2.590077415 podStartE2EDuration="5.744463051s" podCreationTimestamp="2025-10-01 12:49:24 +0000 UTC" firstStartedPulling="2025-10-01 12:49:25.618403636 +0000 UTC m=+743.939758473" lastFinishedPulling="2025-10-01 12:49:28.772789272 +0000 UTC m=+747.094144109" observedRunningTime="2025-10-01 12:49:29.740587099 +0000 UTC m=+748.061941946" watchObservedRunningTime="2025-10-01 12:49:29.744463051 +0000 UTC m=+748.065817888" Oct 01 12:49:29 crc kubenswrapper[4727]: I1001 12:49:29.751336 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zhq2q"] Oct 01 12:49:29 crc kubenswrapper[4727]: I1001 12:49:29.751594 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" podUID="4d871c42-cfe9-4f9d-80b3-2ccef1246050" containerName="controller-manager" containerID="cri-o://f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6" gracePeriod=30 Oct 01 12:49:29 crc kubenswrapper[4727]: I1001 12:49:29.852482 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn"] Oct 01 12:49:29 crc kubenswrapper[4727]: I1001 12:49:29.853088 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" podUID="beaa2552-3c16-4136-99e5-50e5eb116f04" containerName="route-controller-manager" containerID="cri-o://ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2" gracePeriod=30 Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.162609 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-b9qrr" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.192055 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.223831 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4nbzc" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.268825 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.301516 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d871c42-cfe9-4f9d-80b3-2ccef1246050-serving-cert\") pod \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.301563 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-config\") pod \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.301597 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5ml5\" (UniqueName: \"kubernetes.io/projected/095d3a10-d7e4-4e02-907d-359eb1603abe-kube-api-access-z5ml5\") pod \"095d3a10-d7e4-4e02-907d-359eb1603abe\" (UID: \"095d3a10-d7e4-4e02-907d-359eb1603abe\") " Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.301629 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4qzn\" (UniqueName: \"kubernetes.io/projected/4d871c42-cfe9-4f9d-80b3-2ccef1246050-kube-api-access-q4qzn\") pod \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.301649 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-client-ca\") pod \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.301672 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-proxy-ca-bundles\") pod \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\" (UID: \"4d871c42-cfe9-4f9d-80b3-2ccef1246050\") " Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.302607 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4d871c42-cfe9-4f9d-80b3-2ccef1246050" (UID: "4d871c42-cfe9-4f9d-80b3-2ccef1246050"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.303644 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-client-ca" (OuterVolumeSpecName: "client-ca") pod "4d871c42-cfe9-4f9d-80b3-2ccef1246050" (UID: "4d871c42-cfe9-4f9d-80b3-2ccef1246050"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.303740 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-config" (OuterVolumeSpecName: "config") pod "4d871c42-cfe9-4f9d-80b3-2ccef1246050" (UID: "4d871c42-cfe9-4f9d-80b3-2ccef1246050"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.308646 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d871c42-cfe9-4f9d-80b3-2ccef1246050-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4d871c42-cfe9-4f9d-80b3-2ccef1246050" (UID: "4d871c42-cfe9-4f9d-80b3-2ccef1246050"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.308714 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095d3a10-d7e4-4e02-907d-359eb1603abe-kube-api-access-z5ml5" (OuterVolumeSpecName: "kube-api-access-z5ml5") pod "095d3a10-d7e4-4e02-907d-359eb1603abe" (UID: "095d3a10-d7e4-4e02-907d-359eb1603abe"). InnerVolumeSpecName "kube-api-access-z5ml5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.309067 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d871c42-cfe9-4f9d-80b3-2ccef1246050-kube-api-access-q4qzn" (OuterVolumeSpecName: "kube-api-access-q4qzn") pod "4d871c42-cfe9-4f9d-80b3-2ccef1246050" (UID: "4d871c42-cfe9-4f9d-80b3-2ccef1246050"). InnerVolumeSpecName "kube-api-access-q4qzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.402902 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beaa2552-3c16-4136-99e5-50e5eb116f04-client-ca\") pod \"beaa2552-3c16-4136-99e5-50e5eb116f04\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.403023 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beaa2552-3c16-4136-99e5-50e5eb116f04-serving-cert\") pod \"beaa2552-3c16-4136-99e5-50e5eb116f04\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.403060 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94nj2\" (UniqueName: \"kubernetes.io/projected/beaa2552-3c16-4136-99e5-50e5eb116f04-kube-api-access-94nj2\") pod \"beaa2552-3c16-4136-99e5-50e5eb116f04\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.403091 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beaa2552-3c16-4136-99e5-50e5eb116f04-config\") pod \"beaa2552-3c16-4136-99e5-50e5eb116f04\" (UID: \"beaa2552-3c16-4136-99e5-50e5eb116f04\") " Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.403859 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d871c42-cfe9-4f9d-80b3-2ccef1246050-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.403914 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.403924 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5ml5\" (UniqueName: 
\"kubernetes.io/projected/095d3a10-d7e4-4e02-907d-359eb1603abe-kube-api-access-z5ml5\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.403936 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4qzn\" (UniqueName: \"kubernetes.io/projected/4d871c42-cfe9-4f9d-80b3-2ccef1246050-kube-api-access-q4qzn\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.403946 4727 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.403973 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beaa2552-3c16-4136-99e5-50e5eb116f04-client-ca" (OuterVolumeSpecName: "client-ca") pod "beaa2552-3c16-4136-99e5-50e5eb116f04" (UID: "beaa2552-3c16-4136-99e5-50e5eb116f04"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.403994 4727 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d871c42-cfe9-4f9d-80b3-2ccef1246050-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.403959 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beaa2552-3c16-4136-99e5-50e5eb116f04-config" (OuterVolumeSpecName: "config") pod "beaa2552-3c16-4136-99e5-50e5eb116f04" (UID: "beaa2552-3c16-4136-99e5-50e5eb116f04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.418356 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beaa2552-3c16-4136-99e5-50e5eb116f04-kube-api-access-94nj2" (OuterVolumeSpecName: "kube-api-access-94nj2") pod "beaa2552-3c16-4136-99e5-50e5eb116f04" (UID: "beaa2552-3c16-4136-99e5-50e5eb116f04"). InnerVolumeSpecName "kube-api-access-94nj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.431142 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beaa2552-3c16-4136-99e5-50e5eb116f04-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "beaa2552-3c16-4136-99e5-50e5eb116f04" (UID: "beaa2552-3c16-4136-99e5-50e5eb116f04"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.505759 4727 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beaa2552-3c16-4136-99e5-50e5eb116f04-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.505797 4727 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beaa2552-3c16-4136-99e5-50e5eb116f04-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.505810 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94nj2\" (UniqueName: \"kubernetes.io/projected/beaa2552-3c16-4136-99e5-50e5eb116f04-kube-api-access-94nj2\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.505821 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beaa2552-3c16-4136-99e5-50e5eb116f04-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.526072 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-rtkfs" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.735136 4727 generic.go:334] "Generic (PLEG): container finished" podID="beaa2552-3c16-4136-99e5-50e5eb116f04" containerID="ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2" exitCode=0 Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.735248 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.735371 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" event={"ID":"beaa2552-3c16-4136-99e5-50e5eb116f04","Type":"ContainerDied","Data":"ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2"} Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.735432 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn" event={"ID":"beaa2552-3c16-4136-99e5-50e5eb116f04","Type":"ContainerDied","Data":"611aed8fb83ef99febeee40833bc0d52084a812585aa8f86088202b1feb1ceae"} Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.735455 4727 scope.go:117] "RemoveContainer" containerID="ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.737685 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nlvhp" event={"ID":"ea50a03d-3d5c-4e61-9703-6a8980e33a1f","Type":"ContainerStarted","Data":"58dbdecbad1bc39f550ca3609272cf7ac79b990e95ea436bf68b469ba4d1dfe7"} Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.742635 4727 generic.go:334] "Generic (PLEG): container finished" podID="095d3a10-d7e4-4e02-907d-359eb1603abe" containerID="87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09" exitCode=0 Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.742739 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-b9qrr" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.743110 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b9qrr" event={"ID":"095d3a10-d7e4-4e02-907d-359eb1603abe","Type":"ContainerDied","Data":"87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09"} Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.743154 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b9qrr" event={"ID":"095d3a10-d7e4-4e02-907d-359eb1603abe","Type":"ContainerDied","Data":"eee66639274b30b16bc328394fa30a2aaafd542abeb17a3425671bed42024737"} Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.745128 4727 generic.go:334] "Generic (PLEG): container finished" podID="4d871c42-cfe9-4f9d-80b3-2ccef1246050" containerID="f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6" exitCode=0 Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.745218 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" event={"ID":"4d871c42-cfe9-4f9d-80b3-2ccef1246050","Type":"ContainerDied","Data":"f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6"} Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.745248 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.745274 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zhq2q" event={"ID":"4d871c42-cfe9-4f9d-80b3-2ccef1246050","Type":"ContainerDied","Data":"94b659bd9f3453ee7ae3cf5e0efd459c8fb9d9e901a9ecf4150fd5856b01ac3a"} Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.761618 4727 scope.go:117] "RemoveContainer" containerID="ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2" Oct 01 12:49:30 crc kubenswrapper[4727]: E1001 12:49:30.762838 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2\": container with ID starting with ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2 not found: ID does not exist" containerID="ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.762876 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2"} err="failed to get container status \"ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2\": rpc error: code = NotFound desc = could not find container \"ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2\": container with ID starting with ff7d2574f639b4f4a3d18781a3cc7d635d533c4202efd9fc9758c28daa1ff0a2 not found: ID does not exist" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.762907 4727 scope.go:117] "RemoveContainer" containerID="87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.765934 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nlvhp" podStartSLOduration=2.6883872589999998 
podStartE2EDuration="2.76590459s" podCreationTimestamp="2025-10-01 12:49:28 +0000 UTC" firstStartedPulling="2025-10-01 12:49:29.599991025 +0000 UTC m=+747.921345862" lastFinishedPulling="2025-10-01 12:49:29.677508356 +0000 UTC m=+747.998863193" observedRunningTime="2025-10-01 12:49:30.762628176 +0000 UTC m=+749.083983033" watchObservedRunningTime="2025-10-01 12:49:30.76590459 +0000 UTC m=+749.087259427" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.783761 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zhq2q"] Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.788731 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zhq2q"] Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.789207 4727 scope.go:117] "RemoveContainer" containerID="87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09" Oct 01 12:49:30 crc kubenswrapper[4727]: E1001 12:49:30.789897 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09\": container with ID starting with 87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09 not found: ID does not exist" containerID="87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.790027 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09"} err="failed to get container status \"87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09\": rpc error: code = NotFound desc = could not find container \"87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09\": container with ID starting with 87c9b8637c93c777ade68d53d2260c7b3918759d72ddc882a4200c351c396c09 not found: ID does not exist" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.790127 4727 scope.go:117] "RemoveContainer" containerID="f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.801971 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-b9qrr"] Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.806600 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-b9qrr"] Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.812276 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn"] Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.817516 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w8gvn"] Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.824237 4727 scope.go:117] "RemoveContainer" containerID="f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6" Oct 01 12:49:30 crc kubenswrapper[4727]: E1001 12:49:30.824841 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6\": container with ID starting with f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6 not found: ID does not exist" 
containerID="f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6" Oct 01 12:49:30 crc kubenswrapper[4727]: I1001 12:49:30.824871 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6"} err="failed to get container status \"f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6\": rpc error: code = NotFound desc = could not find container \"f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6\": container with ID starting with f183f45567835122838b4260c0f36387342d315dfd8d6453b0e6c438c50145d6 not found: ID does not exist" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.190079 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx"] Oct 01 12:49:31 crc kubenswrapper[4727]: E1001 12:49:31.190360 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d871c42-cfe9-4f9d-80b3-2ccef1246050" containerName="controller-manager" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.190378 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d871c42-cfe9-4f9d-80b3-2ccef1246050" containerName="controller-manager" Oct 01 12:49:31 crc kubenswrapper[4727]: E1001 12:49:31.190402 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095d3a10-d7e4-4e02-907d-359eb1603abe" containerName="registry-server" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.190411 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="095d3a10-d7e4-4e02-907d-359eb1603abe" containerName="registry-server" Oct 01 12:49:31 crc kubenswrapper[4727]: E1001 12:49:31.190418 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beaa2552-3c16-4136-99e5-50e5eb116f04" containerName="route-controller-manager" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.190425 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="beaa2552-3c16-4136-99e5-50e5eb116f04" containerName="route-controller-manager" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.190545 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="beaa2552-3c16-4136-99e5-50e5eb116f04" containerName="route-controller-manager" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.190560 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d871c42-cfe9-4f9d-80b3-2ccef1246050" containerName="controller-manager" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.190569 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="095d3a10-d7e4-4e02-907d-359eb1603abe" containerName="registry-server" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.191041 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.193185 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.193427 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.193599 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.193747 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.193884 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.197402 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq"] Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.198499 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.199878 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.201694 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.202213 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.202293 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.202602 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.202919 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.203951 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq"] Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.205733 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.210415 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.210980 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx"] Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.316059 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjkp7\" (UniqueName: 
\"kubernetes.io/projected/d0828202-d83c-477e-abd7-528d1e1b63d1-kube-api-access-gjkp7\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.316107 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb48z\" (UniqueName: \"kubernetes.io/projected/b7db9177-6e48-458c-911c-579033c250b5-kube-api-access-rb48z\") pod \"route-controller-manager-6d8b9b4f67-vkdjx\" (UID: \"b7db9177-6e48-458c-911c-579033c250b5\") " pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.316163 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0828202-d83c-477e-abd7-528d1e1b63d1-config\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.316199 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0828202-d83c-477e-abd7-528d1e1b63d1-serving-cert\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.316251 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0828202-d83c-477e-abd7-528d1e1b63d1-client-ca\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.316284 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7db9177-6e48-458c-911c-579033c250b5-serving-cert\") pod \"route-controller-manager-6d8b9b4f67-vkdjx\" (UID: \"b7db9177-6e48-458c-911c-579033c250b5\") " pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.316308 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0828202-d83c-477e-abd7-528d1e1b63d1-proxy-ca-bundles\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.316330 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7db9177-6e48-458c-911c-579033c250b5-client-ca\") pod \"route-controller-manager-6d8b9b4f67-vkdjx\" (UID: \"b7db9177-6e48-458c-911c-579033c250b5\") " pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.316475 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b7db9177-6e48-458c-911c-579033c250b5-config\") pod \"route-controller-manager-6d8b9b4f67-vkdjx\" (UID: \"b7db9177-6e48-458c-911c-579033c250b5\") " pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.418258 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0828202-d83c-477e-abd7-528d1e1b63d1-client-ca\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.418722 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7db9177-6e48-458c-911c-579033c250b5-serving-cert\") pod \"route-controller-manager-6d8b9b4f67-vkdjx\" (UID: \"b7db9177-6e48-458c-911c-579033c250b5\") " pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.418834 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0828202-d83c-477e-abd7-528d1e1b63d1-proxy-ca-bundles\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.418872 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7db9177-6e48-458c-911c-579033c250b5-client-ca\") pod \"route-controller-manager-6d8b9b4f67-vkdjx\" (UID: \"b7db9177-6e48-458c-911c-579033c250b5\") " pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.418938 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7db9177-6e48-458c-911c-579033c250b5-config\") pod \"route-controller-manager-6d8b9b4f67-vkdjx\" (UID: \"b7db9177-6e48-458c-911c-579033c250b5\") " pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.419041 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjkp7\" (UniqueName: \"kubernetes.io/projected/d0828202-d83c-477e-abd7-528d1e1b63d1-kube-api-access-gjkp7\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.419091 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb48z\" (UniqueName: \"kubernetes.io/projected/b7db9177-6e48-458c-911c-579033c250b5-kube-api-access-rb48z\") pod \"route-controller-manager-6d8b9b4f67-vkdjx\" (UID: \"b7db9177-6e48-458c-911c-579033c250b5\") " pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.419145 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0828202-d83c-477e-abd7-528d1e1b63d1-config\") pod 
\"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.419174 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0828202-d83c-477e-abd7-528d1e1b63d1-serving-cert\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.419250 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0828202-d83c-477e-abd7-528d1e1b63d1-client-ca\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.420919 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0828202-d83c-477e-abd7-528d1e1b63d1-config\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.421150 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0828202-d83c-477e-abd7-528d1e1b63d1-proxy-ca-bundles\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.421978 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7db9177-6e48-458c-911c-579033c250b5-config\") pod \"route-controller-manager-6d8b9b4f67-vkdjx\" (UID: \"b7db9177-6e48-458c-911c-579033c250b5\") " pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.423374 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7db9177-6e48-458c-911c-579033c250b5-client-ca\") pod \"route-controller-manager-6d8b9b4f67-vkdjx\" (UID: \"b7db9177-6e48-458c-911c-579033c250b5\") " pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.425773 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7db9177-6e48-458c-911c-579033c250b5-serving-cert\") pod \"route-controller-manager-6d8b9b4f67-vkdjx\" (UID: \"b7db9177-6e48-458c-911c-579033c250b5\") " pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.426038 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0828202-d83c-477e-abd7-528d1e1b63d1-serving-cert\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.437611 4727 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjkp7\" (UniqueName: \"kubernetes.io/projected/d0828202-d83c-477e-abd7-528d1e1b63d1-kube-api-access-gjkp7\") pod \"controller-manager-6b94f6cfd4-wtmzq\" (UID: \"d0828202-d83c-477e-abd7-528d1e1b63d1\") " pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.438659 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb48z\" (UniqueName: \"kubernetes.io/projected/b7db9177-6e48-458c-911c-579033c250b5-kube-api-access-rb48z\") pod \"route-controller-manager-6d8b9b4f67-vkdjx\" (UID: \"b7db9177-6e48-458c-911c-579033c250b5\") " pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.515811 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.536036 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.771232 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq"] Oct 01 12:49:31 crc kubenswrapper[4727]: I1001 12:49:31.932648 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx"] Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.380156 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095d3a10-d7e4-4e02-907d-359eb1603abe" path="/var/lib/kubelet/pods/095d3a10-d7e4-4e02-907d-359eb1603abe/volumes" Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.381101 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d871c42-cfe9-4f9d-80b3-2ccef1246050" path="/var/lib/kubelet/pods/4d871c42-cfe9-4f9d-80b3-2ccef1246050/volumes" Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.381825 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beaa2552-3c16-4136-99e5-50e5eb116f04" path="/var/lib/kubelet/pods/beaa2552-3c16-4136-99e5-50e5eb116f04/volumes" Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.770071 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" event={"ID":"d0828202-d83c-477e-abd7-528d1e1b63d1","Type":"ContainerStarted","Data":"de8ae526ea681c4a61425c7f7b56d152f92c66de0e8b2f03310fa2ec9bf0a0d2"} Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.770123 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" event={"ID":"d0828202-d83c-477e-abd7-528d1e1b63d1","Type":"ContainerStarted","Data":"0f586f6841a6011df1bcad7e7491484749fef4f533230d70a0edb31c38347fa7"} Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.771382 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.772839 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" 
event={"ID":"b7db9177-6e48-458c-911c-579033c250b5","Type":"ContainerStarted","Data":"6ab5e44d8c8d9a42c592a7484407d5f70baacd153ccc0c956fec773785fcfaea"} Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.772883 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" event={"ID":"b7db9177-6e48-458c-911c-579033c250b5","Type":"ContainerStarted","Data":"1fd45d95bd04d2999764115885a8d5c8e3eafbf59cda75cf9115b249fc02a8ed"} Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.773526 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.777644 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.781084 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.792919 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b94f6cfd4-wtmzq" podStartSLOduration=3.792896954 podStartE2EDuration="3.792896954s" podCreationTimestamp="2025-10-01 12:49:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:49:32.792355517 +0000 UTC m=+751.113710354" watchObservedRunningTime="2025-10-01 12:49:32.792896954 +0000 UTC m=+751.114251791" Oct 01 12:49:32 crc kubenswrapper[4727]: I1001 12:49:32.849966 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d8b9b4f67-vkdjx" podStartSLOduration=3.849942654 podStartE2EDuration="3.849942654s" podCreationTimestamp="2025-10-01 12:49:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:49:32.818139105 +0000 UTC m=+751.139493972" watchObservedRunningTime="2025-10-01 12:49:32.849942654 +0000 UTC m=+751.171297511" Oct 01 12:49:33 crc kubenswrapper[4727]: I1001 12:49:33.291884 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:49:33 crc kubenswrapper[4727]: I1001 12:49:33.292269 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:49:33 crc kubenswrapper[4727]: I1001 12:49:33.292385 4727 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:49:33 crc kubenswrapper[4727]: I1001 12:49:33.293607 4727 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"5a5c4ca99360c9b81c10e0ced10d126f629e2db295e44de92257033d1fe6295f"} pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:49:33 crc kubenswrapper[4727]: I1001 12:49:33.293776 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" containerID="cri-o://5a5c4ca99360c9b81c10e0ced10d126f629e2db295e44de92257033d1fe6295f" gracePeriod=600 Oct 01 12:49:33 crc kubenswrapper[4727]: I1001 12:49:33.782517 4727 generic.go:334] "Generic (PLEG): container finished" podID="d18290ae-64a5-44a5-a704-90977d85852b" containerID="5a5c4ca99360c9b81c10e0ced10d126f629e2db295e44de92257033d1fe6295f" exitCode=0 Oct 01 12:49:33 crc kubenswrapper[4727]: I1001 12:49:33.782655 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerDied","Data":"5a5c4ca99360c9b81c10e0ced10d126f629e2db295e44de92257033d1fe6295f"} Oct 01 12:49:33 crc kubenswrapper[4727]: I1001 12:49:33.783175 4727 scope.go:117] "RemoveContainer" containerID="12b4ca882a4878e5a3279395bccb6e21a4ad58217420588536e4e56fcd67eeb7" Oct 01 12:49:34 crc kubenswrapper[4727]: I1001 12:49:34.792220 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"8ee5ee2e5696638af5bc213bd13dc53b7b85703a971ba03bb8cf933270c1945e"} Oct 01 12:49:38 crc kubenswrapper[4727]: I1001 12:49:38.995338 4727 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.158618 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-nlvhp" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.158678 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-nlvhp" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.188682 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-nlvhp" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.241010 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2pzbt"] Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.242275 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.254063 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2pzbt"] Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.428089 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-utilities\") pod \"certified-operators-2pzbt\" (UID: \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\") " pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.428144 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-catalog-content\") pod \"certified-operators-2pzbt\" (UID: \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\") " pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.428200 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp28s\" (UniqueName: \"kubernetes.io/projected/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-kube-api-access-fp28s\") pod \"certified-operators-2pzbt\" (UID: \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\") " pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.529351 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp28s\" (UniqueName: \"kubernetes.io/projected/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-kube-api-access-fp28s\") pod \"certified-operators-2pzbt\" (UID: \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\") " pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.529465 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-utilities\") pod \"certified-operators-2pzbt\" (UID: \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\") " pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.529496 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-catalog-content\") pod \"certified-operators-2pzbt\" (UID: \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\") " pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.530068 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-catalog-content\") pod \"certified-operators-2pzbt\" (UID: \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\") " pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.530187 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-utilities\") pod \"certified-operators-2pzbt\" (UID: \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\") " pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.552064 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fp28s\" (UniqueName: \"kubernetes.io/projected/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-kube-api-access-fp28s\") pod \"certified-operators-2pzbt\" (UID: \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\") " pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.569981 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:39 crc kubenswrapper[4727]: I1001 12:49:39.858766 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-nlvhp" Oct 01 12:49:40 crc kubenswrapper[4727]: I1001 12:49:40.066197 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2pzbt"] Oct 01 12:49:40 crc kubenswrapper[4727]: I1001 12:49:40.830507 4727 generic.go:334] "Generic (PLEG): container finished" podID="985279b0-dcf3-480f-a52c-1daf9fa1a0e6" containerID="d92470ead1eaa04f21da89dc4af93352411023419a54a678439bbf4dc2f8991c" exitCode=0 Oct 01 12:49:40 crc kubenswrapper[4727]: I1001 12:49:40.830589 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pzbt" event={"ID":"985279b0-dcf3-480f-a52c-1daf9fa1a0e6","Type":"ContainerDied","Data":"d92470ead1eaa04f21da89dc4af93352411023419a54a678439bbf4dc2f8991c"} Oct 01 12:49:40 crc kubenswrapper[4727]: I1001 12:49:40.830897 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pzbt" event={"ID":"985279b0-dcf3-480f-a52c-1daf9fa1a0e6","Type":"ContainerStarted","Data":"bbcf8e73216a9603ec3c1f3d2296864e5d4e18f443c28897ccfe944b37a14c3b"} Oct 01 12:49:42 crc kubenswrapper[4727]: I1001 12:49:42.854252 4727 generic.go:334] "Generic (PLEG): container finished" podID="985279b0-dcf3-480f-a52c-1daf9fa1a0e6" containerID="006c724c60b3d1a798f87ab25effec8eee265cc78128ee617fba1480c0bab9e1" exitCode=0 Oct 01 12:49:42 crc kubenswrapper[4727]: I1001 12:49:42.854354 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pzbt" event={"ID":"985279b0-dcf3-480f-a52c-1daf9fa1a0e6","Type":"ContainerDied","Data":"006c724c60b3d1a798f87ab25effec8eee265cc78128ee617fba1480c0bab9e1"} Oct 01 12:49:43 crc kubenswrapper[4727]: I1001 12:49:43.863535 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pzbt" event={"ID":"985279b0-dcf3-480f-a52c-1daf9fa1a0e6","Type":"ContainerStarted","Data":"c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d"} Oct 01 12:49:43 crc kubenswrapper[4727]: I1001 12:49:43.884885 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2pzbt" podStartSLOduration=2.220681109 podStartE2EDuration="4.884865906s" podCreationTimestamp="2025-10-01 12:49:39 +0000 UTC" firstStartedPulling="2025-10-01 12:49:40.832466088 +0000 UTC m=+759.153820925" lastFinishedPulling="2025-10-01 12:49:43.496650875 +0000 UTC m=+761.818005722" observedRunningTime="2025-10-01 12:49:43.880842569 +0000 UTC m=+762.202197406" watchObservedRunningTime="2025-10-01 12:49:43.884865906 +0000 UTC m=+762.206220733" Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 12:49:45.677181 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm"] Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 
12:49:45.678818 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 12:49:45.681265 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-t6s2h" Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 12:49:45.687280 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm"] Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 12:49:45.823325 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3159c1e1-b299-4837-bb69-06e886f09112-util\") pod \"e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm\" (UID: \"3159c1e1-b299-4837-bb69-06e886f09112\") " pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 12:49:45.823414 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pt2t\" (UniqueName: \"kubernetes.io/projected/3159c1e1-b299-4837-bb69-06e886f09112-kube-api-access-5pt2t\") pod \"e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm\" (UID: \"3159c1e1-b299-4837-bb69-06e886f09112\") " pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 12:49:45.823450 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3159c1e1-b299-4837-bb69-06e886f09112-bundle\") pod \"e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm\" (UID: \"3159c1e1-b299-4837-bb69-06e886f09112\") " pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 12:49:45.925036 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pt2t\" (UniqueName: \"kubernetes.io/projected/3159c1e1-b299-4837-bb69-06e886f09112-kube-api-access-5pt2t\") pod \"e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm\" (UID: \"3159c1e1-b299-4837-bb69-06e886f09112\") " pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 12:49:45.925163 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3159c1e1-b299-4837-bb69-06e886f09112-bundle\") pod \"e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm\" (UID: \"3159c1e1-b299-4837-bb69-06e886f09112\") " pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 12:49:45.925229 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3159c1e1-b299-4837-bb69-06e886f09112-util\") pod \"e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm\" (UID: \"3159c1e1-b299-4837-bb69-06e886f09112\") " pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 12:49:45.926060 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3159c1e1-b299-4837-bb69-06e886f09112-util\") pod \"e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm\" (UID: \"3159c1e1-b299-4837-bb69-06e886f09112\") " pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 12:49:45.926080 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3159c1e1-b299-4837-bb69-06e886f09112-bundle\") pod \"e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm\" (UID: \"3159c1e1-b299-4837-bb69-06e886f09112\") " pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:45 crc kubenswrapper[4727]: I1001 12:49:45.945072 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pt2t\" (UniqueName: \"kubernetes.io/projected/3159c1e1-b299-4837-bb69-06e886f09112-kube-api-access-5pt2t\") pod \"e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm\" (UID: \"3159c1e1-b299-4837-bb69-06e886f09112\") " pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:46 crc kubenswrapper[4727]: I1001 12:49:46.001445 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:46 crc kubenswrapper[4727]: I1001 12:49:46.473105 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm"] Oct 01 12:49:46 crc kubenswrapper[4727]: W1001 12:49:46.478434 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3159c1e1_b299_4837_bb69_06e886f09112.slice/crio-cca5c348078148a05697f032019fe871caeed0dc9c3499235ce364e03f5a3d67 WatchSource:0}: Error finding container cca5c348078148a05697f032019fe871caeed0dc9c3499235ce364e03f5a3d67: Status 404 returned error can't find the container with id cca5c348078148a05697f032019fe871caeed0dc9c3499235ce364e03f5a3d67 Oct 01 12:49:46 crc kubenswrapper[4727]: I1001 12:49:46.885158 4727 generic.go:334] "Generic (PLEG): container finished" podID="3159c1e1-b299-4837-bb69-06e886f09112" containerID="bd2826022ab5fc8976665af95be7f38d423ce4d38366e4e85d390efbcd277413" exitCode=0 Oct 01 12:49:46 crc kubenswrapper[4727]: I1001 12:49:46.885310 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" event={"ID":"3159c1e1-b299-4837-bb69-06e886f09112","Type":"ContainerDied","Data":"bd2826022ab5fc8976665af95be7f38d423ce4d38366e4e85d390efbcd277413"} Oct 01 12:49:46 crc kubenswrapper[4727]: I1001 12:49:46.885477 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" event={"ID":"3159c1e1-b299-4837-bb69-06e886f09112","Type":"ContainerStarted","Data":"cca5c348078148a05697f032019fe871caeed0dc9c3499235ce364e03f5a3d67"} Oct 01 12:49:49 crc kubenswrapper[4727]: I1001 12:49:49.570517 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:49 crc kubenswrapper[4727]: I1001 12:49:49.571144 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:49 crc 
kubenswrapper[4727]: I1001 12:49:49.609342 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:49 crc kubenswrapper[4727]: I1001 12:49:49.911060 4727 generic.go:334] "Generic (PLEG): container finished" podID="3159c1e1-b299-4837-bb69-06e886f09112" containerID="30b8fdadf4b7d82345a6423d8f7950c3bda3d318f2c4b0dcc2ac85f7d33da144" exitCode=0 Oct 01 12:49:49 crc kubenswrapper[4727]: I1001 12:49:49.912083 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" event={"ID":"3159c1e1-b299-4837-bb69-06e886f09112","Type":"ContainerDied","Data":"30b8fdadf4b7d82345a6423d8f7950c3bda3d318f2c4b0dcc2ac85f7d33da144"} Oct 01 12:49:49 crc kubenswrapper[4727]: I1001 12:49:49.953280 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:50 crc kubenswrapper[4727]: I1001 12:49:50.921340 4727 generic.go:334] "Generic (PLEG): container finished" podID="3159c1e1-b299-4837-bb69-06e886f09112" containerID="1423ce9faa544204aab2bbed99ac0995b4abee1e35280c4aaf7ccab8d5d3c6e4" exitCode=0 Oct 01 12:49:50 crc kubenswrapper[4727]: I1001 12:49:50.921455 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" event={"ID":"3159c1e1-b299-4837-bb69-06e886f09112","Type":"ContainerDied","Data":"1423ce9faa544204aab2bbed99ac0995b4abee1e35280c4aaf7ccab8d5d3c6e4"} Oct 01 12:49:51 crc kubenswrapper[4727]: I1001 12:49:51.631576 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2pzbt"] Oct 01 12:49:51 crc kubenswrapper[4727]: I1001 12:49:51.928262 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2pzbt" podUID="985279b0-dcf3-480f-a52c-1daf9fa1a0e6" containerName="registry-server" containerID="cri-o://c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d" gracePeriod=2 Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.301956 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.418879 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3159c1e1-b299-4837-bb69-06e886f09112-bundle\") pod \"3159c1e1-b299-4837-bb69-06e886f09112\" (UID: \"3159c1e1-b299-4837-bb69-06e886f09112\") " Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.419023 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3159c1e1-b299-4837-bb69-06e886f09112-util\") pod \"3159c1e1-b299-4837-bb69-06e886f09112\" (UID: \"3159c1e1-b299-4837-bb69-06e886f09112\") " Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.419569 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3159c1e1-b299-4837-bb69-06e886f09112-bundle" (OuterVolumeSpecName: "bundle") pod "3159c1e1-b299-4837-bb69-06e886f09112" (UID: "3159c1e1-b299-4837-bb69-06e886f09112"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.419883 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pt2t\" (UniqueName: \"kubernetes.io/projected/3159c1e1-b299-4837-bb69-06e886f09112-kube-api-access-5pt2t\") pod \"3159c1e1-b299-4837-bb69-06e886f09112\" (UID: \"3159c1e1-b299-4837-bb69-06e886f09112\") " Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.421718 4727 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3159c1e1-b299-4837-bb69-06e886f09112-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.430246 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3159c1e1-b299-4837-bb69-06e886f09112-kube-api-access-5pt2t" (OuterVolumeSpecName: "kube-api-access-5pt2t") pod "3159c1e1-b299-4837-bb69-06e886f09112" (UID: "3159c1e1-b299-4837-bb69-06e886f09112"). InnerVolumeSpecName "kube-api-access-5pt2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.435890 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3159c1e1-b299-4837-bb69-06e886f09112-util" (OuterVolumeSpecName: "util") pod "3159c1e1-b299-4837-bb69-06e886f09112" (UID: "3159c1e1-b299-4837-bb69-06e886f09112"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.523428 4727 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3159c1e1-b299-4837-bb69-06e886f09112-util\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.523471 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pt2t\" (UniqueName: \"kubernetes.io/projected/3159c1e1-b299-4837-bb69-06e886f09112-kube-api-access-5pt2t\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.739403 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.928111 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-utilities\") pod \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\" (UID: \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\") " Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.928194 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-catalog-content\") pod \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\" (UID: \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\") " Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.928222 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp28s\" (UniqueName: \"kubernetes.io/projected/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-kube-api-access-fp28s\") pod \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\" (UID: \"985279b0-dcf3-480f-a52c-1daf9fa1a0e6\") " Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.929412 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-utilities" (OuterVolumeSpecName: "utilities") pod "985279b0-dcf3-480f-a52c-1daf9fa1a0e6" (UID: "985279b0-dcf3-480f-a52c-1daf9fa1a0e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.933683 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-kube-api-access-fp28s" (OuterVolumeSpecName: "kube-api-access-fp28s") pod "985279b0-dcf3-480f-a52c-1daf9fa1a0e6" (UID: "985279b0-dcf3-480f-a52c-1daf9fa1a0e6"). InnerVolumeSpecName "kube-api-access-fp28s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.939710 4727 generic.go:334] "Generic (PLEG): container finished" podID="985279b0-dcf3-480f-a52c-1daf9fa1a0e6" containerID="c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d" exitCode=0 Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.939792 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pzbt" event={"ID":"985279b0-dcf3-480f-a52c-1daf9fa1a0e6","Type":"ContainerDied","Data":"c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d"} Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.939823 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pzbt" event={"ID":"985279b0-dcf3-480f-a52c-1daf9fa1a0e6","Type":"ContainerDied","Data":"bbcf8e73216a9603ec3c1f3d2296864e5d4e18f443c28897ccfe944b37a14c3b"} Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.939839 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2pzbt" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.939843 4727 scope.go:117] "RemoveContainer" containerID="c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.946033 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" event={"ID":"3159c1e1-b299-4837-bb69-06e886f09112","Type":"ContainerDied","Data":"cca5c348078148a05697f032019fe871caeed0dc9c3499235ce364e03f5a3d67"} Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.946328 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cca5c348078148a05697f032019fe871caeed0dc9c3499235ce364e03f5a3d67" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.946065 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.962633 4727 scope.go:117] "RemoveContainer" containerID="006c724c60b3d1a798f87ab25effec8eee265cc78128ee617fba1480c0bab9e1" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.981509 4727 scope.go:117] "RemoveContainer" containerID="d92470ead1eaa04f21da89dc4af93352411023419a54a678439bbf4dc2f8991c" Oct 01 12:49:52 crc kubenswrapper[4727]: I1001 12:49:52.998913 4727 scope.go:117] "RemoveContainer" containerID="c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d" Oct 01 12:49:53 crc kubenswrapper[4727]: E1001 12:49:53.000075 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d\": container with ID starting with c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d not found: ID does not exist" containerID="c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d" Oct 01 12:49:53 crc kubenswrapper[4727]: I1001 12:49:53.000143 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d"} err="failed to get container status \"c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d\": rpc error: code = NotFound desc = could not find container \"c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d\": container with ID starting with c01df7330b0e0356a65127655527479040a637f734da13dbac4accb3d750d27d not found: ID does not exist" Oct 01 12:49:53 crc kubenswrapper[4727]: I1001 12:49:53.000180 4727 scope.go:117] "RemoveContainer" containerID="006c724c60b3d1a798f87ab25effec8eee265cc78128ee617fba1480c0bab9e1" Oct 01 12:49:53 crc kubenswrapper[4727]: E1001 12:49:53.000912 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006c724c60b3d1a798f87ab25effec8eee265cc78128ee617fba1480c0bab9e1\": container with ID starting with 006c724c60b3d1a798f87ab25effec8eee265cc78128ee617fba1480c0bab9e1 not found: ID does not exist" containerID="006c724c60b3d1a798f87ab25effec8eee265cc78128ee617fba1480c0bab9e1" Oct 01 12:49:53 crc kubenswrapper[4727]: I1001 12:49:53.000962 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006c724c60b3d1a798f87ab25effec8eee265cc78128ee617fba1480c0bab9e1"} err="failed to 
get container status \"006c724c60b3d1a798f87ab25effec8eee265cc78128ee617fba1480c0bab9e1\": rpc error: code = NotFound desc = could not find container \"006c724c60b3d1a798f87ab25effec8eee265cc78128ee617fba1480c0bab9e1\": container with ID starting with 006c724c60b3d1a798f87ab25effec8eee265cc78128ee617fba1480c0bab9e1 not found: ID does not exist" Oct 01 12:49:53 crc kubenswrapper[4727]: I1001 12:49:53.001015 4727 scope.go:117] "RemoveContainer" containerID="d92470ead1eaa04f21da89dc4af93352411023419a54a678439bbf4dc2f8991c" Oct 01 12:49:53 crc kubenswrapper[4727]: E1001 12:49:53.001485 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d92470ead1eaa04f21da89dc4af93352411023419a54a678439bbf4dc2f8991c\": container with ID starting with d92470ead1eaa04f21da89dc4af93352411023419a54a678439bbf4dc2f8991c not found: ID does not exist" containerID="d92470ead1eaa04f21da89dc4af93352411023419a54a678439bbf4dc2f8991c" Oct 01 12:49:53 crc kubenswrapper[4727]: I1001 12:49:53.001549 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d92470ead1eaa04f21da89dc4af93352411023419a54a678439bbf4dc2f8991c"} err="failed to get container status \"d92470ead1eaa04f21da89dc4af93352411023419a54a678439bbf4dc2f8991c\": rpc error: code = NotFound desc = could not find container \"d92470ead1eaa04f21da89dc4af93352411023419a54a678439bbf4dc2f8991c\": container with ID starting with d92470ead1eaa04f21da89dc4af93352411023419a54a678439bbf4dc2f8991c not found: ID does not exist" Oct 01 12:49:53 crc kubenswrapper[4727]: I1001 12:49:53.029347 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:53 crc kubenswrapper[4727]: I1001 12:49:53.029403 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp28s\" (UniqueName: \"kubernetes.io/projected/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-kube-api-access-fp28s\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:53 crc kubenswrapper[4727]: I1001 12:49:53.248232 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "985279b0-dcf3-480f-a52c-1daf9fa1a0e6" (UID: "985279b0-dcf3-480f-a52c-1daf9fa1a0e6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:49:53 crc kubenswrapper[4727]: I1001 12:49:53.332873 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/985279b0-dcf3-480f-a52c-1daf9fa1a0e6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:49:53 crc kubenswrapper[4727]: I1001 12:49:53.579411 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2pzbt"] Oct 01 12:49:53 crc kubenswrapper[4727]: I1001 12:49:53.583944 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2pzbt"] Oct 01 12:49:54 crc kubenswrapper[4727]: I1001 12:49:54.383037 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="985279b0-dcf3-480f-a52c-1daf9fa1a0e6" path="/var/lib/kubelet/pods/985279b0-dcf3-480f-a52c-1daf9fa1a0e6/volumes" Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.806910 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g"] Oct 01 12:49:56 crc kubenswrapper[4727]: E1001 12:49:56.807889 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985279b0-dcf3-480f-a52c-1daf9fa1a0e6" containerName="registry-server" Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.807911 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="985279b0-dcf3-480f-a52c-1daf9fa1a0e6" containerName="registry-server" Oct 01 12:49:56 crc kubenswrapper[4727]: E1001 12:49:56.807931 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985279b0-dcf3-480f-a52c-1daf9fa1a0e6" containerName="extract-content" Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.807940 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="985279b0-dcf3-480f-a52c-1daf9fa1a0e6" containerName="extract-content" Oct 01 12:49:56 crc kubenswrapper[4727]: E1001 12:49:56.807955 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3159c1e1-b299-4837-bb69-06e886f09112" containerName="extract" Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.807966 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="3159c1e1-b299-4837-bb69-06e886f09112" containerName="extract" Oct 01 12:49:56 crc kubenswrapper[4727]: E1001 12:49:56.807981 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985279b0-dcf3-480f-a52c-1daf9fa1a0e6" containerName="extract-utilities" Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.807989 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="985279b0-dcf3-480f-a52c-1daf9fa1a0e6" containerName="extract-utilities" Oct 01 12:49:56 crc kubenswrapper[4727]: E1001 12:49:56.808117 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3159c1e1-b299-4837-bb69-06e886f09112" containerName="util" Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.808128 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="3159c1e1-b299-4837-bb69-06e886f09112" containerName="util" Oct 01 12:49:56 crc kubenswrapper[4727]: E1001 12:49:56.808141 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3159c1e1-b299-4837-bb69-06e886f09112" containerName="pull" Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.808150 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="3159c1e1-b299-4837-bb69-06e886f09112" containerName="pull" Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.808301 4727 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="3159c1e1-b299-4837-bb69-06e886f09112" containerName="extract" Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.808322 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="985279b0-dcf3-480f-a52c-1daf9fa1a0e6" containerName="registry-server" Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.809340 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g" Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.812131 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-zptt6" Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.843121 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g"] Oct 01 12:49:56 crc kubenswrapper[4727]: I1001 12:49:56.981812 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvsk4\" (UniqueName: \"kubernetes.io/projected/007fa737-02ad-4360-8e6f-245b87f1c91d-kube-api-access-nvsk4\") pod \"openstack-operator-controller-operator-c85b59bf-qns8g\" (UID: \"007fa737-02ad-4360-8e6f-245b87f1c91d\") " pod="openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g" Oct 01 12:49:57 crc kubenswrapper[4727]: I1001 12:49:57.083352 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvsk4\" (UniqueName: \"kubernetes.io/projected/007fa737-02ad-4360-8e6f-245b87f1c91d-kube-api-access-nvsk4\") pod \"openstack-operator-controller-operator-c85b59bf-qns8g\" (UID: \"007fa737-02ad-4360-8e6f-245b87f1c91d\") " pod="openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g" Oct 01 12:49:57 crc kubenswrapper[4727]: I1001 12:49:57.103566 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvsk4\" (UniqueName: \"kubernetes.io/projected/007fa737-02ad-4360-8e6f-245b87f1c91d-kube-api-access-nvsk4\") pod \"openstack-operator-controller-operator-c85b59bf-qns8g\" (UID: \"007fa737-02ad-4360-8e6f-245b87f1c91d\") " pod="openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g" Oct 01 12:49:57 crc kubenswrapper[4727]: I1001 12:49:57.133469 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g" Oct 01 12:49:57 crc kubenswrapper[4727]: I1001 12:49:57.580561 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g"] Oct 01 12:49:57 crc kubenswrapper[4727]: I1001 12:49:57.987816 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g" event={"ID":"007fa737-02ad-4360-8e6f-245b87f1c91d","Type":"ContainerStarted","Data":"7eacfdd40d08d5c2069951e41ac2aebcd72a5973d9be3956fa05660cfe735a6e"} Oct 01 12:50:03 crc kubenswrapper[4727]: I1001 12:50:03.030661 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g" event={"ID":"007fa737-02ad-4360-8e6f-245b87f1c91d","Type":"ContainerStarted","Data":"12428d3ae3b89c83d6da34c41d336f3d498db91a282022b1faa907ca9a4a868f"} Oct 01 12:50:05 crc kubenswrapper[4727]: I1001 12:50:05.046091 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g" event={"ID":"007fa737-02ad-4360-8e6f-245b87f1c91d","Type":"ContainerStarted","Data":"24574aa6d049cb7313e4effc1cf1164e9b277cb398851ecb1775295098ac4c08"} Oct 01 12:50:05 crc kubenswrapper[4727]: I1001 12:50:05.046413 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g" Oct 01 12:50:05 crc kubenswrapper[4727]: I1001 12:50:05.082535 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g" podStartSLOduration=2.188367585 podStartE2EDuration="9.082509526s" podCreationTimestamp="2025-10-01 12:49:56 +0000 UTC" firstStartedPulling="2025-10-01 12:49:57.591619975 +0000 UTC m=+775.912974812" lastFinishedPulling="2025-10-01 12:50:04.485761916 +0000 UTC m=+782.807116753" observedRunningTime="2025-10-01 12:50:05.075431191 +0000 UTC m=+783.396786048" watchObservedRunningTime="2025-10-01 12:50:05.082509526 +0000 UTC m=+783.403864363" Oct 01 12:50:07 crc kubenswrapper[4727]: I1001 12:50:07.136144 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-c85b59bf-qns8g" Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.251133 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-847n2"] Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.253198 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.265386 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-847n2"] Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.408385 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-catalog-content\") pod \"redhat-operators-847n2\" (UID: \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\") " pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.409468 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-utilities\") pod \"redhat-operators-847n2\" (UID: \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\") " pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.409552 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htnbp\" (UniqueName: \"kubernetes.io/projected/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-kube-api-access-htnbp\") pod \"redhat-operators-847n2\" (UID: \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\") " pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.510920 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-catalog-content\") pod \"redhat-operators-847n2\" (UID: \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\") " pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.511063 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-utilities\") pod \"redhat-operators-847n2\" (UID: \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\") " pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.511101 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htnbp\" (UniqueName: \"kubernetes.io/projected/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-kube-api-access-htnbp\") pod \"redhat-operators-847n2\" (UID: \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\") " pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.511643 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-utilities\") pod \"redhat-operators-847n2\" (UID: \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\") " pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.511919 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-catalog-content\") pod \"redhat-operators-847n2\" (UID: \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\") " pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.531915 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-htnbp\" (UniqueName: \"kubernetes.io/projected/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-kube-api-access-htnbp\") pod \"redhat-operators-847n2\" (UID: \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\") " pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:12 crc kubenswrapper[4727]: I1001 12:50:12.575191 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:13 crc kubenswrapper[4727]: I1001 12:50:13.068398 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-847n2"] Oct 01 12:50:13 crc kubenswrapper[4727]: W1001 12:50:13.079234 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf82b2e3_4f8f_4446_b4d5_d0e6b0f8ec03.slice/crio-ca7b29864b038fa133147bff92538b68f0235a6144149a4bac2b2723a22377c6 WatchSource:0}: Error finding container ca7b29864b038fa133147bff92538b68f0235a6144149a4bac2b2723a22377c6: Status 404 returned error can't find the container with id ca7b29864b038fa133147bff92538b68f0235a6144149a4bac2b2723a22377c6 Oct 01 12:50:13 crc kubenswrapper[4727]: I1001 12:50:13.110533 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-847n2" event={"ID":"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03","Type":"ContainerStarted","Data":"ca7b29864b038fa133147bff92538b68f0235a6144149a4bac2b2723a22377c6"} Oct 01 12:50:14 crc kubenswrapper[4727]: I1001 12:50:14.118698 4727 generic.go:334] "Generic (PLEG): container finished" podID="cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" containerID="f58fa227f43ebf454c35a495d9cc6cb0adddc41fcb35a41aa87d1dd531c9d09a" exitCode=0 Oct 01 12:50:14 crc kubenswrapper[4727]: I1001 12:50:14.118925 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-847n2" event={"ID":"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03","Type":"ContainerDied","Data":"f58fa227f43ebf454c35a495d9cc6cb0adddc41fcb35a41aa87d1dd531c9d09a"} Oct 01 12:50:16 crc kubenswrapper[4727]: I1001 12:50:16.137055 4727 generic.go:334] "Generic (PLEG): container finished" podID="cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" containerID="3379e90c7535b0165e8a001f03af8ea9c74f55cd6593f6f1992c92426aa4732e" exitCode=0 Oct 01 12:50:16 crc kubenswrapper[4727]: I1001 12:50:16.137234 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-847n2" event={"ID":"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03","Type":"ContainerDied","Data":"3379e90c7535b0165e8a001f03af8ea9c74f55cd6593f6f1992c92426aa4732e"} Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.145600 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-847n2" event={"ID":"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03","Type":"ContainerStarted","Data":"596c4d552438d4a610e0b1605bc342b25b0144660a8633e6d4c2c679a19677dd"} Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.178652 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-847n2" podStartSLOduration=2.568977134 podStartE2EDuration="5.178633321s" podCreationTimestamp="2025-10-01 12:50:12 +0000 UTC" firstStartedPulling="2025-10-01 12:50:14.120947964 +0000 UTC m=+792.442302801" lastFinishedPulling="2025-10-01 12:50:16.730604151 +0000 UTC m=+795.051958988" observedRunningTime="2025-10-01 12:50:17.1726406 +0000 UTC m=+795.493995457" watchObservedRunningTime="2025-10-01 12:50:17.178633321 +0000 
UTC m=+795.499988178" Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.613287 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s2z7z"] Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.614502 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.626120 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2z7z"] Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.689736 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b8f22d2-30de-475f-8277-551a27dc6ce7-utilities\") pod \"redhat-marketplace-s2z7z\" (UID: \"6b8f22d2-30de-475f-8277-551a27dc6ce7\") " pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.689870 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b8f22d2-30de-475f-8277-551a27dc6ce7-catalog-content\") pod \"redhat-marketplace-s2z7z\" (UID: \"6b8f22d2-30de-475f-8277-551a27dc6ce7\") " pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.689983 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcx5h\" (UniqueName: \"kubernetes.io/projected/6b8f22d2-30de-475f-8277-551a27dc6ce7-kube-api-access-tcx5h\") pod \"redhat-marketplace-s2z7z\" (UID: \"6b8f22d2-30de-475f-8277-551a27dc6ce7\") " pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.792720 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcx5h\" (UniqueName: \"kubernetes.io/projected/6b8f22d2-30de-475f-8277-551a27dc6ce7-kube-api-access-tcx5h\") pod \"redhat-marketplace-s2z7z\" (UID: \"6b8f22d2-30de-475f-8277-551a27dc6ce7\") " pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.792838 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b8f22d2-30de-475f-8277-551a27dc6ce7-utilities\") pod \"redhat-marketplace-s2z7z\" (UID: \"6b8f22d2-30de-475f-8277-551a27dc6ce7\") " pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.792895 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b8f22d2-30de-475f-8277-551a27dc6ce7-catalog-content\") pod \"redhat-marketplace-s2z7z\" (UID: \"6b8f22d2-30de-475f-8277-551a27dc6ce7\") " pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.793475 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b8f22d2-30de-475f-8277-551a27dc6ce7-catalog-content\") pod \"redhat-marketplace-s2z7z\" (UID: \"6b8f22d2-30de-475f-8277-551a27dc6ce7\") " pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.794606 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6b8f22d2-30de-475f-8277-551a27dc6ce7-utilities\") pod \"redhat-marketplace-s2z7z\" (UID: \"6b8f22d2-30de-475f-8277-551a27dc6ce7\") " pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.823646 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcx5h\" (UniqueName: \"kubernetes.io/projected/6b8f22d2-30de-475f-8277-551a27dc6ce7-kube-api-access-tcx5h\") pod \"redhat-marketplace-s2z7z\" (UID: \"6b8f22d2-30de-475f-8277-551a27dc6ce7\") " pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:17 crc kubenswrapper[4727]: I1001 12:50:17.931915 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:18 crc kubenswrapper[4727]: I1001 12:50:18.312546 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2z7z"] Oct 01 12:50:19 crc kubenswrapper[4727]: I1001 12:50:19.180978 4727 generic.go:334] "Generic (PLEG): container finished" podID="6b8f22d2-30de-475f-8277-551a27dc6ce7" containerID="bad00c9b86d92fefc924e90b2ad8dc3ae59d0925c8ab4613c1cfcd9b1d2ec2cd" exitCode=0 Oct 01 12:50:19 crc kubenswrapper[4727]: I1001 12:50:19.181051 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2z7z" event={"ID":"6b8f22d2-30de-475f-8277-551a27dc6ce7","Type":"ContainerDied","Data":"bad00c9b86d92fefc924e90b2ad8dc3ae59d0925c8ab4613c1cfcd9b1d2ec2cd"} Oct 01 12:50:19 crc kubenswrapper[4727]: I1001 12:50:19.181107 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2z7z" event={"ID":"6b8f22d2-30de-475f-8277-551a27dc6ce7","Type":"ContainerStarted","Data":"241bb4125f567c1c8729f4219575a7927f3a89f7b53e9272eeb022fd7b503778"} Oct 01 12:50:21 crc kubenswrapper[4727]: I1001 12:50:21.195307 4727 generic.go:334] "Generic (PLEG): container finished" podID="6b8f22d2-30de-475f-8277-551a27dc6ce7" containerID="2dc43babd2366e63d5fdb7774109b17aa5f48c7096f0eca313b6049c5df43b39" exitCode=0 Oct 01 12:50:21 crc kubenswrapper[4727]: I1001 12:50:21.195363 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2z7z" event={"ID":"6b8f22d2-30de-475f-8277-551a27dc6ce7","Type":"ContainerDied","Data":"2dc43babd2366e63d5fdb7774109b17aa5f48c7096f0eca313b6049c5df43b39"} Oct 01 12:50:22 crc kubenswrapper[4727]: I1001 12:50:22.202760 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2z7z" event={"ID":"6b8f22d2-30de-475f-8277-551a27dc6ce7","Type":"ContainerStarted","Data":"5ba009bf984cb867ab10b7b8db87fb9aeb13766a56b156fea719d1dd2c08fe87"} Oct 01 12:50:22 crc kubenswrapper[4727]: I1001 12:50:22.231279 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s2z7z" podStartSLOduration=2.765140662 podStartE2EDuration="5.231255643s" podCreationTimestamp="2025-10-01 12:50:17 +0000 UTC" firstStartedPulling="2025-10-01 12:50:19.185257458 +0000 UTC m=+797.506612305" lastFinishedPulling="2025-10-01 12:50:21.651372449 +0000 UTC m=+799.972727286" observedRunningTime="2025-10-01 12:50:22.226652008 +0000 UTC m=+800.548006865" watchObservedRunningTime="2025-10-01 12:50:22.231255643 +0000 UTC m=+800.552610500" Oct 01 12:50:22 crc kubenswrapper[4727]: I1001 12:50:22.576201 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:22 crc kubenswrapper[4727]: I1001 12:50:22.577205 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:22 crc kubenswrapper[4727]: I1001 12:50:22.633826 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.262776 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.731746 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.733874 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.735911 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-cz7xl" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.750785 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.752868 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.755099 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.757548 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-jcpl4" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.778434 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.785786 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.787096 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.797798 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-pmcrr" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.799319 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.800481 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.802768 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-r22c4" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.811510 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.837772 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.838857 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.842034 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xh8pc" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.843048 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.847878 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.865896 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.867244 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.870956 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4s6c4" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.885204 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bs6b\" (UniqueName: \"kubernetes.io/projected/8ed41f7a-f315-407e-b7a8-c5dc3fef764a-kube-api-access-7bs6b\") pod \"cinder-operator-controller-manager-644bddb6d8-j9gmz\" (UID: \"8ed41f7a-f315-407e-b7a8-c5dc3fef764a\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.885397 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svk8c\" (UniqueName: \"kubernetes.io/projected/21ca64fa-6683-4cac-97cd-32944d87bced-kube-api-access-svk8c\") pod \"barbican-operator-controller-manager-6ff8b75857-p4w7m\" (UID: \"21ca64fa-6683-4cac-97cd-32944d87bced\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.896881 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.902632 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.903893 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.909891 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.910092 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kdhnq" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.910470 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.911480 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.917655 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7d69g" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.921655 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.926599 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.927687 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.931740 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-gw9c8" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.963805 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.983253 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.983876 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx"] Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.995665 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7zw6\" (UniqueName: \"kubernetes.io/projected/4409e813-a7ba-440c-8ef3-22ecac8a1093-kube-api-access-t7zw6\") pod \"horizon-operator-controller-manager-9f4696d94-6m8j5\" (UID: \"4409e813-a7ba-440c-8ef3-22ecac8a1093\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.995928 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsl65\" (UniqueName: \"kubernetes.io/projected/dc459bd0-7d95-4fe6-981a-7afdb763efa8-kube-api-access-fsl65\") pod \"heat-operator-controller-manager-5d889d78cf-dvddt\" (UID: \"dc459bd0-7d95-4fe6-981a-7afdb763efa8\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.996109 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svk8c\" (UniqueName: \"kubernetes.io/projected/21ca64fa-6683-4cac-97cd-32944d87bced-kube-api-access-svk8c\") pod \"barbican-operator-controller-manager-6ff8b75857-p4w7m\" (UID: \"21ca64fa-6683-4cac-97cd-32944d87bced\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.996376 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q85jh\" (UniqueName: \"kubernetes.io/projected/583d4e80-fb09-4853-8d80-9df371bf58e6-kube-api-access-q85jh\") pod \"glance-operator-controller-manager-84958c4d49-82kh4\" (UID: \"583d4e80-fb09-4853-8d80-9df371bf58e6\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.996627 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwrzg\" (UniqueName: \"kubernetes.io/projected/71ad1cc3-a660-4a74-b15d-b1c7e03bf785-kube-api-access-vwrzg\") pod \"designate-operator-controller-manager-84f4f7b77b-2mrqm\" (UID: \"71ad1cc3-a660-4a74-b15d-b1c7e03bf785\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm" Oct 01 12:50:23 crc kubenswrapper[4727]: I1001 12:50:23.996809 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bs6b\" (UniqueName: \"kubernetes.io/projected/8ed41f7a-f315-407e-b7a8-c5dc3fef764a-kube-api-access-7bs6b\") pod 
\"cinder-operator-controller-manager-644bddb6d8-j9gmz\" (UID: \"8ed41f7a-f315-407e-b7a8-c5dc3fef764a\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.008224 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.014400 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-tww79" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.018545 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.070026 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.081386 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svk8c\" (UniqueName: \"kubernetes.io/projected/21ca64fa-6683-4cac-97cd-32944d87bced-kube-api-access-svk8c\") pod \"barbican-operator-controller-manager-6ff8b75857-p4w7m\" (UID: \"21ca64fa-6683-4cac-97cd-32944d87bced\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.083722 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-p98cw" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.104344 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99ea0596-d1a9-434c-a176-0b4a244ecc83-cert\") pod \"infra-operator-controller-manager-9d6c5db85-c88bk\" (UID: \"99ea0596-d1a9-434c-a176-0b4a244ecc83\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.104450 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtx86\" (UniqueName: \"kubernetes.io/projected/47e3cb37-ce4b-4280-9863-ad6a95b1347c-kube-api-access-mtx86\") pod \"ironic-operator-controller-manager-5cd4858477-6zhw5\" (UID: \"47e3cb37-ce4b-4280-9863-ad6a95b1347c\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.104498 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv6jc\" (UniqueName: \"kubernetes.io/projected/a5c6c947-8392-4385-9448-ca70c91635e6-kube-api-access-rv6jc\") pod \"keystone-operator-controller-manager-5bd55b4bff-k7cf5\" (UID: \"a5c6c947-8392-4385-9448-ca70c91635e6\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.104540 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54nlv\" (UniqueName: \"kubernetes.io/projected/99ea0596-d1a9-434c-a176-0b4a244ecc83-kube-api-access-54nlv\") pod \"infra-operator-controller-manager-9d6c5db85-c88bk\" (UID: \"99ea0596-d1a9-434c-a176-0b4a244ecc83\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" Oct 01 
12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.104625 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7zw6\" (UniqueName: \"kubernetes.io/projected/4409e813-a7ba-440c-8ef3-22ecac8a1093-kube-api-access-t7zw6\") pod \"horizon-operator-controller-manager-9f4696d94-6m8j5\" (UID: \"4409e813-a7ba-440c-8ef3-22ecac8a1093\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.104671 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsl65\" (UniqueName: \"kubernetes.io/projected/dc459bd0-7d95-4fe6-981a-7afdb763efa8-kube-api-access-fsl65\") pod \"heat-operator-controller-manager-5d889d78cf-dvddt\" (UID: \"dc459bd0-7d95-4fe6-981a-7afdb763efa8\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.104713 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q85jh\" (UniqueName: \"kubernetes.io/projected/583d4e80-fb09-4853-8d80-9df371bf58e6-kube-api-access-q85jh\") pod \"glance-operator-controller-manager-84958c4d49-82kh4\" (UID: \"583d4e80-fb09-4853-8d80-9df371bf58e6\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.104737 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbsnl\" (UniqueName: \"kubernetes.io/projected/f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506-kube-api-access-qbsnl\") pod \"mariadb-operator-controller-manager-88c7-7vqrx\" (UID: \"f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.104769 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwrzg\" (UniqueName: \"kubernetes.io/projected/71ad1cc3-a660-4a74-b15d-b1c7e03bf785-kube-api-access-vwrzg\") pod \"designate-operator-controller-manager-84f4f7b77b-2mrqm\" (UID: \"71ad1cc3-a660-4a74-b15d-b1c7e03bf785\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.105949 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bs6b\" (UniqueName: \"kubernetes.io/projected/8ed41f7a-f315-407e-b7a8-c5dc3fef764a-kube-api-access-7bs6b\") pod \"cinder-operator-controller-manager-644bddb6d8-j9gmz\" (UID: \"8ed41f7a-f315-407e-b7a8-c5dc3fef764a\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.106043 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.129583 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwrzg\" (UniqueName: \"kubernetes.io/projected/71ad1cc3-a660-4a74-b15d-b1c7e03bf785-kube-api-access-vwrzg\") pod \"designate-operator-controller-manager-84f4f7b77b-2mrqm\" (UID: \"71ad1cc3-a660-4a74-b15d-b1c7e03bf785\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.133445 4727 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.133752 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7zw6\" (UniqueName: \"kubernetes.io/projected/4409e813-a7ba-440c-8ef3-22ecac8a1093-kube-api-access-t7zw6\") pod \"horizon-operator-controller-manager-9f4696d94-6m8j5\" (UID: \"4409e813-a7ba-440c-8ef3-22ecac8a1093\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.134753 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.141928 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsl65\" (UniqueName: \"kubernetes.io/projected/dc459bd0-7d95-4fe6-981a-7afdb763efa8-kube-api-access-fsl65\") pod \"heat-operator-controller-manager-5d889d78cf-dvddt\" (UID: \"dc459bd0-7d95-4fe6-981a-7afdb763efa8\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.143429 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.159785 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q85jh\" (UniqueName: \"kubernetes.io/projected/583d4e80-fb09-4853-8d80-9df371bf58e6-kube-api-access-q85jh\") pod \"glance-operator-controller-manager-84958c4d49-82kh4\" (UID: \"583d4e80-fb09-4853-8d80-9df371bf58e6\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.160234 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-pthtb" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.160543 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.192377 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.205989 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbsnl\" (UniqueName: \"kubernetes.io/projected/f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506-kube-api-access-qbsnl\") pod \"mariadb-operator-controller-manager-88c7-7vqrx\" (UID: \"f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.206086 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99ea0596-d1a9-434c-a176-0b4a244ecc83-cert\") pod \"infra-operator-controller-manager-9d6c5db85-c88bk\" (UID: \"99ea0596-d1a9-434c-a176-0b4a244ecc83\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.206133 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtx86\" (UniqueName: \"kubernetes.io/projected/47e3cb37-ce4b-4280-9863-ad6a95b1347c-kube-api-access-mtx86\") pod \"ironic-operator-controller-manager-5cd4858477-6zhw5\" (UID: \"47e3cb37-ce4b-4280-9863-ad6a95b1347c\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.206165 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv6jc\" (UniqueName: \"kubernetes.io/projected/a5c6c947-8392-4385-9448-ca70c91635e6-kube-api-access-rv6jc\") pod \"keystone-operator-controller-manager-5bd55b4bff-k7cf5\" (UID: \"a5c6c947-8392-4385-9448-ca70c91635e6\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.206191 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t74xq\" (UniqueName: \"kubernetes.io/projected/52bc77fe-21ba-4ac8-9fca-531e3c80432a-kube-api-access-t74xq\") pod \"manila-operator-controller-manager-6d68dbc695-vgqst\" (UID: \"52bc77fe-21ba-4ac8-9fca-531e3c80432a\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.206222 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwqmz\" (UniqueName: \"kubernetes.io/projected/5e40e563-9455-43dd-a3ef-e442010c31a4-kube-api-access-wwqmz\") pod \"neutron-operator-controller-manager-849d5b9b84-lkdzs\" (UID: \"5e40e563-9455-43dd-a3ef-e442010c31a4\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.206247 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54nlv\" (UniqueName: \"kubernetes.io/projected/99ea0596-d1a9-434c-a176-0b4a244ecc83-kube-api-access-54nlv\") pod \"infra-operator-controller-manager-9d6c5db85-c88bk\" (UID: \"99ea0596-d1a9-434c-a176-0b4a244ecc83\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" Oct 01 12:50:24 crc kubenswrapper[4727]: E1001 12:50:24.206764 4727 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 01 12:50:24 crc 
kubenswrapper[4727]: E1001 12:50:24.206819 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99ea0596-d1a9-434c-a176-0b4a244ecc83-cert podName:99ea0596-d1a9-434c-a176-0b4a244ecc83 nodeName:}" failed. No retries permitted until 2025-10-01 12:50:24.706803095 +0000 UTC m=+803.028157932 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99ea0596-d1a9-434c-a176-0b4a244ecc83-cert") pod "infra-operator-controller-manager-9d6c5db85-c88bk" (UID: "99ea0596-d1a9-434c-a176-0b4a244ecc83") : secret "infra-operator-webhook-server-cert" not found Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.213964 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.230808 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv6jc\" (UniqueName: \"kubernetes.io/projected/a5c6c947-8392-4385-9448-ca70c91635e6-kube-api-access-rv6jc\") pod \"keystone-operator-controller-manager-5bd55b4bff-k7cf5\" (UID: \"a5c6c947-8392-4385-9448-ca70c91635e6\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.230955 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtx86\" (UniqueName: \"kubernetes.io/projected/47e3cb37-ce4b-4280-9863-ad6a95b1347c-kube-api-access-mtx86\") pod \"ironic-operator-controller-manager-5cd4858477-6zhw5\" (UID: \"47e3cb37-ce4b-4280-9863-ad6a95b1347c\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.233263 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54nlv\" (UniqueName: \"kubernetes.io/projected/99ea0596-d1a9-434c-a176-0b4a244ecc83-kube-api-access-54nlv\") pod \"infra-operator-controller-manager-9d6c5db85-c88bk\" (UID: \"99ea0596-d1a9-434c-a176-0b4a244ecc83\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.240377 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbsnl\" (UniqueName: \"kubernetes.io/projected/f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506-kube-api-access-qbsnl\") pod \"mariadb-operator-controller-manager-88c7-7vqrx\" (UID: \"f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.245164 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.247465 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.249023 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.253709 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.254766 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.256497 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.259517 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-tlk4l" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.259708 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-dssrs" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.259797 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.269288 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.270737 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.272979 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.275365 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.275643 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-mkgpl" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.282864 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.283876 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.286340 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-586rx" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.290076 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.302511 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.307912 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t74xq\" (UniqueName: \"kubernetes.io/projected/52bc77fe-21ba-4ac8-9fca-531e3c80432a-kube-api-access-t74xq\") pod \"manila-operator-controller-manager-6d68dbc695-vgqst\" (UID: \"52bc77fe-21ba-4ac8-9fca-531e3c80432a\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.307991 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwqmz\" (UniqueName: \"kubernetes.io/projected/5e40e563-9455-43dd-a3ef-e442010c31a4-kube-api-access-wwqmz\") pod \"neutron-operator-controller-manager-849d5b9b84-lkdzs\" (UID: \"5e40e563-9455-43dd-a3ef-e442010c31a4\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.360862 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.367480 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.370864 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.371635 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.375899 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-z8gml" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.376332 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t74xq\" (UniqueName: \"kubernetes.io/projected/52bc77fe-21ba-4ac8-9fca-531e3c80432a-kube-api-access-t74xq\") pod \"manila-operator-controller-manager-6d68dbc695-vgqst\" (UID: \"52bc77fe-21ba-4ac8-9fca-531e3c80432a\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.376919 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwqmz\" (UniqueName: \"kubernetes.io/projected/5e40e563-9455-43dd-a3ef-e442010c31a4-kube-api-access-wwqmz\") pod \"neutron-operator-controller-manager-849d5b9b84-lkdzs\" (UID: \"5e40e563-9455-43dd-a3ef-e442010c31a4\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.408856 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.409306 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5zsk\" (UniqueName: \"kubernetes.io/projected/4924da7d-07e9-4378-9965-c3e85c3018c8-kube-api-access-m5zsk\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c9z78p\" (UID: \"4924da7d-07e9-4378-9965-c3e85c3018c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.409342 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.409356 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q58qs\" (UniqueName: \"kubernetes.io/projected/b1322ef4-b813-41a1-a851-d9e96e4cf7ef-kube-api-access-q58qs\") pod \"ovn-operator-controller-manager-9976ff44c-wbl5k\" (UID: \"b1322ef4-b813-41a1-a851-d9e96e4cf7ef\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.409394 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkkzv\" (UniqueName: \"kubernetes.io/projected/18ea0de4-19a4-4417-a13e-bec65f0cfc31-kube-api-access-jkkzv\") pod \"nova-operator-controller-manager-64cd67b5cb-dt9z7\" (UID: \"18ea0de4-19a4-4417-a13e-bec65f0cfc31\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.409460 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8g5k\" (UniqueName: \"kubernetes.io/projected/7c69585d-d708-4863-9cdf-bace662d6658-kube-api-access-h8g5k\") pod \"octavia-operator-controller-manager-7b787867f4-x8pr2\" (UID: \"7c69585d-d708-4863-9cdf-bace662d6658\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" 
Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.409494 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4924da7d-07e9-4378-9965-c3e85c3018c8-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c9z78p\" (UID: \"4924da7d-07e9-4378-9965-c3e85c3018c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.410526 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.413557 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-t9zxq" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.420539 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.431630 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.438519 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.439113 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.450987 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.453406 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.455074 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-7zbj5" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.464659 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.518260 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.519375 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4924da7d-07e9-4378-9965-c3e85c3018c8-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c9z78p\" (UID: \"4924da7d-07e9-4378-9965-c3e85c3018c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.527322 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5zsk\" (UniqueName: \"kubernetes.io/projected/4924da7d-07e9-4378-9965-c3e85c3018c8-kube-api-access-m5zsk\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c9z78p\" (UID: \"4924da7d-07e9-4378-9965-c3e85c3018c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" Oct 01 12:50:24 crc kubenswrapper[4727]: E1001 12:50:24.519492 4727 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 12:50:24 crc kubenswrapper[4727]: E1001 12:50:24.536168 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4924da7d-07e9-4378-9965-c3e85c3018c8-cert podName:4924da7d-07e9-4378-9965-c3e85c3018c8 nodeName:}" failed. No retries permitted until 2025-10-01 12:50:25.036139928 +0000 UTC m=+803.357494765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4924da7d-07e9-4378-9965-c3e85c3018c8-cert") pod "openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" (UID: "4924da7d-07e9-4378-9965-c3e85c3018c8") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.536552 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.537738 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q58qs\" (UniqueName: \"kubernetes.io/projected/b1322ef4-b813-41a1-a851-d9e96e4cf7ef-kube-api-access-q58qs\") pod \"ovn-operator-controller-manager-9976ff44c-wbl5k\" (UID: \"b1322ef4-b813-41a1-a851-d9e96e4cf7ef\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.537851 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkkzv\" (UniqueName: \"kubernetes.io/projected/18ea0de4-19a4-4417-a13e-bec65f0cfc31-kube-api-access-jkkzv\") pod \"nova-operator-controller-manager-64cd67b5cb-dt9z7\" (UID: \"18ea0de4-19a4-4417-a13e-bec65f0cfc31\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.537988 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5s66\" (UniqueName: \"kubernetes.io/projected/43e69ea0-ecf5-40a9-ae20-94ac949ebfeb-kube-api-access-z5s66\") pod \"swift-operator-controller-manager-84d6b4b759-czvw6\" (UID: \"43e69ea0-ecf5-40a9-ae20-94ac949ebfeb\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.538038 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8g5k\" (UniqueName: \"kubernetes.io/projected/7c69585d-d708-4863-9cdf-bace662d6658-kube-api-access-h8g5k\") pod \"octavia-operator-controller-manager-7b787867f4-x8pr2\" (UID: \"7c69585d-d708-4863-9cdf-bace662d6658\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.538110 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tc2g\" (UniqueName: \"kubernetes.io/projected/7f874b80-31cc-4c3a-9506-999fb72deac5-kube-api-access-4tc2g\") pod \"placement-operator-controller-manager-589c58c6c-6l8fp\" (UID: \"7f874b80-31cc-4c3a-9506-999fb72deac5\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.554106 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-smzzs"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.555639 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.569517 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-fhhxr" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.575433 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8g5k\" (UniqueName: \"kubernetes.io/projected/7c69585d-d708-4863-9cdf-bace662d6658-kube-api-access-h8g5k\") pod \"octavia-operator-controller-manager-7b787867f4-x8pr2\" (UID: \"7c69585d-d708-4863-9cdf-bace662d6658\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.582780 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5zsk\" (UniqueName: \"kubernetes.io/projected/4924da7d-07e9-4378-9965-c3e85c3018c8-kube-api-access-m5zsk\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c9z78p\" (UID: \"4924da7d-07e9-4378-9965-c3e85c3018c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.584288 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-smzzs"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.589175 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkkzv\" (UniqueName: \"kubernetes.io/projected/18ea0de4-19a4-4417-a13e-bec65f0cfc31-kube-api-access-jkkzv\") pod \"nova-operator-controller-manager-64cd67b5cb-dt9z7\" (UID: \"18ea0de4-19a4-4417-a13e-bec65f0cfc31\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.589799 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q58qs\" (UniqueName: \"kubernetes.io/projected/b1322ef4-b813-41a1-a851-d9e96e4cf7ef-kube-api-access-q58qs\") pod \"ovn-operator-controller-manager-9976ff44c-wbl5k\" (UID: \"b1322ef4-b813-41a1-a851-d9e96e4cf7ef\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.593633 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.625476 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.667739 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9pm6\" (UniqueName: \"kubernetes.io/projected/cc1db3cf-e8c2-4209-9d01-bb825fb693d6-kube-api-access-w9pm6\") pod \"telemetry-operator-controller-manager-b5b89c9dd-6c9pp\" (UID: \"cc1db3cf-e8c2-4209-9d01-bb825fb693d6\") " pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.667924 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5s66\" (UniqueName: \"kubernetes.io/projected/43e69ea0-ecf5-40a9-ae20-94ac949ebfeb-kube-api-access-z5s66\") pod \"swift-operator-controller-manager-84d6b4b759-czvw6\" (UID: \"43e69ea0-ecf5-40a9-ae20-94ac949ebfeb\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.667978 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tc2g\" (UniqueName: \"kubernetes.io/projected/7f874b80-31cc-4c3a-9506-999fb72deac5-kube-api-access-4tc2g\") pod \"placement-operator-controller-manager-589c58c6c-6l8fp\" (UID: \"7f874b80-31cc-4c3a-9506-999fb72deac5\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.711156 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5s66\" (UniqueName: \"kubernetes.io/projected/43e69ea0-ecf5-40a9-ae20-94ac949ebfeb-kube-api-access-z5s66\") pod \"swift-operator-controller-manager-84d6b4b759-czvw6\" (UID: \"43e69ea0-ecf5-40a9-ae20-94ac949ebfeb\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.724470 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.774097 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9pm6\" (UniqueName: \"kubernetes.io/projected/cc1db3cf-e8c2-4209-9d01-bb825fb693d6-kube-api-access-w9pm6\") pod \"telemetry-operator-controller-manager-b5b89c9dd-6c9pp\" (UID: \"cc1db3cf-e8c2-4209-9d01-bb825fb693d6\") " pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.774179 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99ea0596-d1a9-434c-a176-0b4a244ecc83-cert\") pod \"infra-operator-controller-manager-9d6c5db85-c88bk\" (UID: \"99ea0596-d1a9-434c-a176-0b4a244ecc83\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.774222 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crzf8\" (UniqueName: \"kubernetes.io/projected/cd54773a-d526-46e2-a6bd-703886de898c-kube-api-access-crzf8\") pod \"test-operator-controller-manager-85777745bb-smzzs\" (UID: \"cd54773a-d526-46e2-a6bd-703886de898c\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.775227 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tc2g\" (UniqueName: \"kubernetes.io/projected/7f874b80-31cc-4c3a-9506-999fb72deac5-kube-api-access-4tc2g\") pod \"placement-operator-controller-manager-589c58c6c-6l8fp\" (UID: \"7f874b80-31cc-4c3a-9506-999fb72deac5\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.781908 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.784648 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99ea0596-d1a9-434c-a176-0b4a244ecc83-cert\") pod \"infra-operator-controller-manager-9d6c5db85-c88bk\" (UID: \"99ea0596-d1a9-434c-a176-0b4a244ecc83\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.794621 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9pm6\" (UniqueName: \"kubernetes.io/projected/cc1db3cf-e8c2-4209-9d01-bb825fb693d6-kube-api-access-w9pm6\") pod \"telemetry-operator-controller-manager-b5b89c9dd-6c9pp\" (UID: \"cc1db3cf-e8c2-4209-9d01-bb825fb693d6\") " pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.828479 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.829093 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.837653 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.840163 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.849337 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.884571 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-f56xr" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.887888 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mbb9\" (UniqueName: \"kubernetes.io/projected/2325c2e9-2f53-48b4-8dfb-bc1089a0caab-kube-api-access-8mbb9\") pod \"watcher-operator-controller-manager-6b9957f54f-kcsrk\" (UID: \"2325c2e9-2f53-48b4-8dfb-bc1089a0caab\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.888101 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crzf8\" (UniqueName: \"kubernetes.io/projected/cd54773a-d526-46e2-a6bd-703886de898c-kube-api-access-crzf8\") pod \"test-operator-controller-manager-85777745bb-smzzs\" (UID: \"cd54773a-d526-46e2-a6bd-703886de898c\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.918326 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk"] Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.967735 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crzf8\" (UniqueName: \"kubernetes.io/projected/cd54773a-d526-46e2-a6bd-703886de898c-kube-api-access-crzf8\") pod \"test-operator-controller-manager-85777745bb-smzzs\" (UID: \"cd54773a-d526-46e2-a6bd-703886de898c\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.972316 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" Oct 01 12:50:24 crc kubenswrapper[4727]: I1001 12:50:24.989412 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mbb9\" (UniqueName: \"kubernetes.io/projected/2325c2e9-2f53-48b4-8dfb-bc1089a0caab-kube-api-access-8mbb9\") pod \"watcher-operator-controller-manager-6b9957f54f-kcsrk\" (UID: \"2325c2e9-2f53-48b4-8dfb-bc1089a0caab\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.033909 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mbb9\" (UniqueName: \"kubernetes.io/projected/2325c2e9-2f53-48b4-8dfb-bc1089a0caab-kube-api-access-8mbb9\") pod \"watcher-operator-controller-manager-6b9957f54f-kcsrk\" (UID: \"2325c2e9-2f53-48b4-8dfb-bc1089a0caab\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.073656 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz"] Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.075152 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.080174 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz"] Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.084379 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-b6k7m" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.084555 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.088203 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt"] Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.089396 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.091117 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt"] Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.091709 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4924da7d-07e9-4378-9965-c3e85c3018c8-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c9z78p\" (UID: \"4924da7d-07e9-4378-9965-c3e85c3018c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.091822 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6bc6c09-3c9e-4de0-bf11-239a93867c74-cert\") pod \"openstack-operator-controller-manager-5db568f97f-zfnfz\" (UID: \"d6bc6c09-3c9e-4de0-bf11-239a93867c74\") " pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.091853 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twtv5\" (UniqueName: \"kubernetes.io/projected/d6bc6c09-3c9e-4de0-bf11-239a93867c74-kube-api-access-twtv5\") pod \"openstack-operator-controller-manager-5db568f97f-zfnfz\" (UID: \"d6bc6c09-3c9e-4de0-bf11-239a93867c74\") " pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.091880 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc6pf\" (UniqueName: \"kubernetes.io/projected/d823b105-b073-44a4-9a1f-eb067b981295-kube-api-access-kc6pf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt\" (UID: \"d823b105-b073-44a4-9a1f-eb067b981295\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.094464 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ldqvk" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.101641 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-847n2"] Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.113104 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4924da7d-07e9-4378-9965-c3e85c3018c8-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c9z78p\" (UID: \"4924da7d-07e9-4378-9965-c3e85c3018c8\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.194457 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6bc6c09-3c9e-4de0-bf11-239a93867c74-cert\") pod \"openstack-operator-controller-manager-5db568f97f-zfnfz\" (UID: \"d6bc6c09-3c9e-4de0-bf11-239a93867c74\") " pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.194526 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-twtv5\" (UniqueName: \"kubernetes.io/projected/d6bc6c09-3c9e-4de0-bf11-239a93867c74-kube-api-access-twtv5\") pod \"openstack-operator-controller-manager-5db568f97f-zfnfz\" (UID: \"d6bc6c09-3c9e-4de0-bf11-239a93867c74\") " pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.194563 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc6pf\" (UniqueName: \"kubernetes.io/projected/d823b105-b073-44a4-9a1f-eb067b981295-kube-api-access-kc6pf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt\" (UID: \"d823b105-b073-44a4-9a1f-eb067b981295\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt" Oct 01 12:50:25 crc kubenswrapper[4727]: E1001 12:50:25.194827 4727 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 01 12:50:25 crc kubenswrapper[4727]: E1001 12:50:25.194908 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6bc6c09-3c9e-4de0-bf11-239a93867c74-cert podName:d6bc6c09-3c9e-4de0-bf11-239a93867c74 nodeName:}" failed. No retries permitted until 2025-10-01 12:50:25.694884555 +0000 UTC m=+804.016239442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6bc6c09-3c9e-4de0-bf11-239a93867c74-cert") pod "openstack-operator-controller-manager-5db568f97f-zfnfz" (UID: "d6bc6c09-3c9e-4de0-bf11-239a93867c74") : secret "webhook-server-cert" not found Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.230967 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc6pf\" (UniqueName: \"kubernetes.io/projected/d823b105-b073-44a4-9a1f-eb067b981295-kube-api-access-kc6pf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt\" (UID: \"d823b105-b073-44a4-9a1f-eb067b981295\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.239625 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt"] Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.242951 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twtv5\" (UniqueName: \"kubernetes.io/projected/d6bc6c09-3c9e-4de0-bf11-239a93867c74-kube-api-access-twtv5\") pod \"openstack-operator-controller-manager-5db568f97f-zfnfz\" (UID: \"d6bc6c09-3c9e-4de0-bf11-239a93867c74\") " pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.243281 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.258823 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" Oct 01 12:50:25 crc kubenswrapper[4727]: W1001 12:50:25.384851 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc459bd0_7d95_4fe6_981a_7afdb763efa8.slice/crio-77c9f318ed7f0c3b0cd64d46ff395dc096365ff586be1b9b5cb3f5ed22326bcf WatchSource:0}: Error finding container 77c9f318ed7f0c3b0cd64d46ff395dc096365ff586be1b9b5cb3f5ed22326bcf: Status 404 returned error can't find the container with id 77c9f318ed7f0c3b0cd64d46ff395dc096365ff586be1b9b5cb3f5ed22326bcf Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.471106 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.645081 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5"] Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.713493 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6bc6c09-3c9e-4de0-bf11-239a93867c74-cert\") pod \"openstack-operator-controller-manager-5db568f97f-zfnfz\" (UID: \"d6bc6c09-3c9e-4de0-bf11-239a93867c74\") " pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.719032 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6bc6c09-3c9e-4de0-bf11-239a93867c74-cert\") pod \"openstack-operator-controller-manager-5db568f97f-zfnfz\" (UID: \"d6bc6c09-3c9e-4de0-bf11-239a93867c74\") " pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" Oct 01 12:50:25 crc kubenswrapper[4727]: I1001 12:50:25.762520 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.248177 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5" event={"ID":"4409e813-a7ba-440c-8ef3-22ecac8a1093","Type":"ContainerStarted","Data":"d231aa9906dfffaf160c4892cac6c3bb49c6e32c1f063904c7aa91f999cb6d5d"} Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.252389 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt" event={"ID":"dc459bd0-7d95-4fe6-981a-7afdb763efa8","Type":"ContainerStarted","Data":"77c9f318ed7f0c3b0cd64d46ff395dc096365ff586be1b9b5cb3f5ed22326bcf"} Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.252564 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-847n2" podUID="cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" containerName="registry-server" containerID="cri-o://596c4d552438d4a610e0b1605bc342b25b0144660a8633e6d4c2c679a19677dd" gracePeriod=2 Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.300470 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.306142 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.347338 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.371892 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.430564 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.430840 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.432464 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4"] Oct 01 12:50:26 crc kubenswrapper[4727]: W1001 12:50:26.441652 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52bc77fe_21ba_4ac8_9fca_531e3c80432a.slice/crio-4403c82a58c2281b364582589a3c74384b209c8a50740f5565b4549f52f9224b WatchSource:0}: Error finding container 4403c82a58c2281b364582589a3c74384b209c8a50740f5565b4549f52f9224b: Status 404 returned error can't find the container with id 4403c82a58c2281b364582589a3c74384b209c8a50740f5565b4549f52f9224b Oct 01 12:50:26 crc kubenswrapper[4727]: W1001 12:50:26.442520 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18ea0de4_19a4_4417_a13e_bec65f0cfc31.slice/crio-25cef493e416e0870b6906fc41e7d9c3c3bb8338022783f3a843c143db19cbc5 WatchSource:0}: Error finding container 25cef493e416e0870b6906fc41e7d9c3c3bb8338022783f3a843c143db19cbc5: Status 404 returned error can't find the 
container with id 25cef493e416e0870b6906fc41e7d9c3c3bb8338022783f3a843c143db19cbc5 Oct 01 12:50:26 crc kubenswrapper[4727]: W1001 12:50:26.445440 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1322ef4_b813_41a1_a851_d9e96e4cf7ef.slice/crio-25d47ec51e3d7c7d563ddbc72f27ce695d66698e9ab9393168d402359dc62a7d WatchSource:0}: Error finding container 25d47ec51e3d7c7d563ddbc72f27ce695d66698e9ab9393168d402359dc62a7d: Status 404 returned error can't find the container with id 25d47ec51e3d7c7d563ddbc72f27ce695d66698e9ab9393168d402359dc62a7d Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.447757 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2"] Oct 01 12:50:26 crc kubenswrapper[4727]: W1001 12:50:26.451536 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod583d4e80_fb09_4853_8d80_9df371bf58e6.slice/crio-b9a7421f6cab0a93ad5c04e5816f26bbbc9c75935afdfb65d59512ae8ec9b4ed WatchSource:0}: Error finding container b9a7421f6cab0a93ad5c04e5816f26bbbc9c75935afdfb65d59512ae8ec9b4ed: Status 404 returned error can't find the container with id b9a7421f6cab0a93ad5c04e5816f26bbbc9c75935afdfb65d59512ae8ec9b4ed Oct 01 12:50:26 crc kubenswrapper[4727]: W1001 12:50:26.458268 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf31d7fb8_1ac0_4fd0_aa18_4cb9e879b506.slice/crio-00c31423422cfa08ae924b0d365b5f9da0191ec0ba4f59c9faba2eec657f4ec6 WatchSource:0}: Error finding container 00c31423422cfa08ae924b0d365b5f9da0191ec0ba4f59c9faba2eec657f4ec6: Status 404 returned error can't find the container with id 00c31423422cfa08ae924b0d365b5f9da0191ec0ba4f59c9faba2eec657f4ec6 Oct 01 12:50:26 crc kubenswrapper[4727]: W1001 12:50:26.459506 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c69585d_d708_4863_9cdf_bace662d6658.slice/crio-b3ceae11a5edd711a6776fd502744ce8d96fda3709de0378fd2933c064f55f0b WatchSource:0}: Error finding container b3ceae11a5edd711a6776fd502744ce8d96fda3709de0378fd2933c064f55f0b: Status 404 returned error can't find the container with id b3ceae11a5edd711a6776fd502744ce8d96fda3709de0378fd2933c064f55f0b Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.461770 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst"] Oct 01 12:50:26 crc kubenswrapper[4727]: E1001 12:50:26.464362 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qbsnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-88c7-7vqrx_openstack-operators(f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 12:50:26 crc kubenswrapper[4727]: E1001 12:50:26.464713 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h8g5k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7b787867f4-x8pr2_openstack-operators(7c69585d-d708-4863-9cdf-bace662d6658): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 12:50:26 crc kubenswrapper[4727]: E1001 12:50:26.465677 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wwqmz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-849d5b9b84-lkdzs_openstack-operators(5e40e563-9455-43dd-a3ef-e442010c31a4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 12:50:26 crc kubenswrapper[4727]: E1001 12:50:26.465729 4727 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-54nlv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-9d6c5db85-c88bk_openstack-operators(99ea0596-d1a9-434c-a176-0b4a244ecc83): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.491438 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.515614 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.522549 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.528510 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.532533 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk"] Oct 
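[Annotation, not part of the captured log] The repeated ErrImagePull "pull QPS exceeded" dumps above (mariadb, octavia, neutron, and infra operator managers, with more below) are not registry-side failures: the kubelet rate-limits image pulls client-side, governed by the KubeletConfiguration fields registryPullQPS and registryBurst, and a burst of operator pods scheduled in the same instant exhausts the token bucket, so each pod worker logs the error and retries with backoff. The Go sketch below illustrates that token-bucket behaviour with golang.org/x/time/rate; the 5 QPS / burst 10 values mirror the documented KubeletConfiguration defaults and are assumptions for this sketch, not values read from the node under test.

```go
// Editorial sketch, not log output: token-bucket behaviour behind "pull QPS exceeded".
package main

import (
	"fmt"

	"golang.org/x/time/rate"
)

func main() {
	// Assumed values: documented defaults registryPullQPS=5, registryBurst=10.
	limiter := rate.NewLimiter(rate.Limit(5), 10)

	// Simulate many operator pods requesting image pulls at the same instant.
	for i := 1; i <= 15; i++ {
		if limiter.Allow() {
			fmt.Printf("pull %2d: admitted\n", i)
		} else {
			fmt.Printf("pull %2d: pull QPS exceeded (pod worker will retry with backoff)\n", i)
		}
	}
}
```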
01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.650989 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.671907 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz"] Oct 01 12:50:26 crc kubenswrapper[4727]: W1001 12:50:26.681212 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6bc6c09_3c9e_4de0_bf11_239a93867c74.slice/crio-9015817332cabbe6416bf7f9e935ae56300c0a82caed55909f275c1043642295 WatchSource:0}: Error finding container 9015817332cabbe6416bf7f9e935ae56300c0a82caed55909f275c1043642295: Status 404 returned error can't find the container with id 9015817332cabbe6416bf7f9e935ae56300c0a82caed55909f275c1043642295 Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.689481 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.695195 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp"] Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.700666 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp"] Oct 01 12:50:26 crc kubenswrapper[4727]: E1001 12:50:26.701918 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4tc2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-589c58c6c-6l8fp_openstack-operators(7f874b80-31cc-4c3a-9506-999fb72deac5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.708917 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-smzzs"] Oct 01 12:50:26 crc kubenswrapper[4727]: E1001 12:50:26.714350 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.18:5001/openstack-k8s-operators/telemetry-operator:bf75b048435f0292acc396dd25db2ce979264c1e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w9pm6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
telemetry-operator-controller-manager-b5b89c9dd-6c9pp_openstack-operators(cc1db3cf-e8c2-4209-9d01-bb825fb693d6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 12:50:26 crc kubenswrapper[4727]: E1001 12:50:26.714846 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kc6pf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt_openstack-operators(d823b105-b073-44a4-9a1f-eb067b981295): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 12:50:26 crc kubenswrapper[4727]: E1001 12:50:26.715704 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-crzf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-85777745bb-smzzs_openstack-operators(cd54773a-d526-46e2-a6bd-703886de898c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 12:50:26 crc kubenswrapper[4727]: E1001 12:50:26.715959 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt" podUID="d823b105-b073-44a4-9a1f-eb067b981295" Oct 01 12:50:26 crc kubenswrapper[4727]: I1001 12:50:26.716468 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p"] Oct 01 12:50:26 crc kubenswrapper[4727]: E1001 12:50:26.717273 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Val
ue:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.
io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/o
penstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m5zsk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-77b9676b8c9z78p_openstack-operators(4924da7d-07e9-4378-9965-c3e85c3018c8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.053545 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" podUID="5e40e563-9455-43dd-a3ef-e442010c31a4" Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.062603 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" podUID="4924da7d-07e9-4378-9965-c3e85c3018c8" Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.062701 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" podUID="cd54773a-d526-46e2-a6bd-703886de898c" Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.062795 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" podUID="cc1db3cf-e8c2-4209-9d01-bb825fb693d6" Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.062872 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" podUID="99ea0596-d1a9-434c-a176-0b4a244ecc83" Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.063371 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" podUID="f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506" Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.063705 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" podUID="7f874b80-31cc-4c3a-9506-999fb72deac5" Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.073700 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" podUID="7c69585d-d708-4863-9cdf-bace662d6658" Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.261388 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" event={"ID":"cc1db3cf-e8c2-4209-9d01-bb825fb693d6","Type":"ContainerStarted","Data":"15ddecfc23b30105720152312d44906f08ea1ee4b996a60dd249d034e4b24c09"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.261742 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" event={"ID":"cc1db3cf-e8c2-4209-9d01-bb825fb693d6","Type":"ContainerStarted","Data":"6d8c300074e9b6ffaba12ffd4257dd2759b8835b0012e50ae2ba3fba0a17df62"} Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.262650 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.18:5001/openstack-k8s-operators/telemetry-operator:bf75b048435f0292acc396dd25db2ce979264c1e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" podUID="cc1db3cf-e8c2-4209-9d01-bb825fb693d6" Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.262885 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4" event={"ID":"583d4e80-fb09-4853-8d80-9df371bf58e6","Type":"ContainerStarted","Data":"b9a7421f6cab0a93ad5c04e5816f26bbbc9c75935afdfb65d59512ae8ec9b4ed"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.269182 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" event={"ID":"cd54773a-d526-46e2-a6bd-703886de898c","Type":"ContainerStarted","Data":"0a6341385c3f0d838d055aa4c4a033791e160ec57d9077c423069f9397cf03f0"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.269446 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" event={"ID":"cd54773a-d526-46e2-a6bd-703886de898c","Type":"ContainerStarted","Data":"cf37e4ba2f39c5b74cd4ab8366dae948a0047bc4700ee73ec169aeb49528da98"} Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.273184 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" podUID="cd54773a-d526-46e2-a6bd-703886de898c" Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.276215 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" event={"ID":"5e40e563-9455-43dd-a3ef-e442010c31a4","Type":"ContainerStarted","Data":"94a2ae4de117b3ab762ba6a27f0f980ea30e29999d946fd4c2554c74dc7aa99f"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.276269 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" event={"ID":"5e40e563-9455-43dd-a3ef-e442010c31a4","Type":"ContainerStarted","Data":"303f682346fcac537eed898c92fb1d4c196558a5f0bce03b485a61b40dd86132"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.278314 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt" event={"ID":"d823b105-b073-44a4-9a1f-eb067b981295","Type":"ContainerStarted","Data":"aa9423b130066bb892e2e45c6698feab9d8dfb07320a7112bd0dbf071575aa81"} Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.278853 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" podUID="5e40e563-9455-43dd-a3ef-e442010c31a4" Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.284615 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt" podUID="d823b105-b073-44a4-9a1f-eb067b981295" Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.285255 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k" event={"ID":"b1322ef4-b813-41a1-a851-d9e96e4cf7ef","Type":"ContainerStarted","Data":"25d47ec51e3d7c7d563ddbc72f27ce695d66698e9ab9393168d402359dc62a7d"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.286955 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m" event={"ID":"21ca64fa-6683-4cac-97cd-32944d87bced","Type":"ContainerStarted","Data":"2ccb3753c5883a475653848bcafdbeb55f6d11b23c5f3fc701ca939d16ce0dc7"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.289481 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" event={"ID":"7c69585d-d708-4863-9cdf-bace662d6658","Type":"ContainerStarted","Data":"4b1daec896cc4f0c86c53cc7a41d960d43bf71dd54b523479cc5953820c954a3"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.289517 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" 
event={"ID":"7c69585d-d708-4863-9cdf-bace662d6658","Type":"ContainerStarted","Data":"b3ceae11a5edd711a6776fd502744ce8d96fda3709de0378fd2933c064f55f0b"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.290887 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm" event={"ID":"71ad1cc3-a660-4a74-b15d-b1c7e03bf785","Type":"ContainerStarted","Data":"411705ea30407d9788c44d8fc6764a6b15e83d811512ba14c97d80ac4c44733c"} Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.291575 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" podUID="7c69585d-d708-4863-9cdf-bace662d6658" Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.293737 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" event={"ID":"f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506","Type":"ContainerStarted","Data":"3754e3b1f1d36e8bcbfbe2b1d6c8f3259a20c9bf65fb47844b9c6e53ff135ea0"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.293797 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" event={"ID":"f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506","Type":"ContainerStarted","Data":"00c31423422cfa08ae924b0d365b5f9da0191ec0ba4f59c9faba2eec657f4ec6"} Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.295474 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" podUID="f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506" Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.307940 4727 generic.go:334] "Generic (PLEG): container finished" podID="cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" containerID="596c4d552438d4a610e0b1605bc342b25b0144660a8633e6d4c2c679a19677dd" exitCode=0 Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.308027 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-847n2" event={"ID":"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03","Type":"ContainerDied","Data":"596c4d552438d4a610e0b1605bc342b25b0144660a8633e6d4c2c679a19677dd"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.311814 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" event={"ID":"7f874b80-31cc-4c3a-9506-999fb72deac5","Type":"ContainerStarted","Data":"af7eae56e472034ea58e62b87faa5a0000187c6b61b2d1568eac7d13d954c519"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.311853 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" event={"ID":"7f874b80-31cc-4c3a-9506-999fb72deac5","Type":"ContainerStarted","Data":"e26697f452e69139ffc66715ac83cda81a2091bd7a2f830b14d16cd63351f93f"} Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.314366 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" podUID="7f874b80-31cc-4c3a-9506-999fb72deac5" Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.323444 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst" event={"ID":"52bc77fe-21ba-4ac8-9fca-531e3c80432a","Type":"ContainerStarted","Data":"4403c82a58c2281b364582589a3c74384b209c8a50740f5565b4549f52f9224b"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.328131 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6" event={"ID":"43e69ea0-ecf5-40a9-ae20-94ac949ebfeb","Type":"ContainerStarted","Data":"e8762e50545194d8ed9ac6409752b2f62e57fc7f3d60953f18a18921e4a633e1"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.333166 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk" event={"ID":"2325c2e9-2f53-48b4-8dfb-bc1089a0caab","Type":"ContainerStarted","Data":"24e24b48ccc69e298ffa524794c2eb62c230523a6a223e37970ddbde8244f337"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.334381 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7" event={"ID":"18ea0de4-19a4-4417-a13e-bec65f0cfc31","Type":"ContainerStarted","Data":"25cef493e416e0870b6906fc41e7d9c3c3bb8338022783f3a843c143db19cbc5"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.343083 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" event={"ID":"99ea0596-d1a9-434c-a176-0b4a244ecc83","Type":"ContainerStarted","Data":"aa607bfcec759a1ccfab357c243763f620ea8584454494adac21b7b7a5c0b52b"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.343124 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" event={"ID":"99ea0596-d1a9-434c-a176-0b4a244ecc83","Type":"ContainerStarted","Data":"0bdfa741c94ebc50bc2538c7021e3c6cc70e2cb0e27c2b2d6dfdb9077d9db6fd"} Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.346202 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" podUID="99ea0596-d1a9-434c-a176-0b4a244ecc83" Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.348412 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" event={"ID":"4924da7d-07e9-4378-9965-c3e85c3018c8","Type":"ContainerStarted","Data":"bc32d6c1315852c61c01da5174051f368cbb905b148f62c03c407ffd970ce971"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.348458 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" 
event={"ID":"4924da7d-07e9-4378-9965-c3e85c3018c8","Type":"ContainerStarted","Data":"35f99d4f71ed27dc308783224af526b769aa0dd5566cf3618578b097ba8f1188"} Oct 01 12:50:27 crc kubenswrapper[4727]: E1001 12:50:27.351299 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" podUID="4924da7d-07e9-4378-9965-c3e85c3018c8" Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.359935 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz" event={"ID":"8ed41f7a-f315-407e-b7a8-c5dc3fef764a","Type":"ContainerStarted","Data":"89da1e908c3b6630b85855dcf6db197e9fadbd8e17361c7fa7e98f9a52a5db32"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.361629 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5" event={"ID":"47e3cb37-ce4b-4280-9863-ad6a95b1347c","Type":"ContainerStarted","Data":"a644c0a10adb1e49e5a3126190250754d9b87fad7e780bbea8dbd08b67c06ea6"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.366643 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" event={"ID":"d6bc6c09-3c9e-4de0-bf11-239a93867c74","Type":"ContainerStarted","Data":"77e6988ba94ecefede13eb74149bac9d0981691f6b8a51d5b257f7b6e1301e5f"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.366670 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" event={"ID":"d6bc6c09-3c9e-4de0-bf11-239a93867c74","Type":"ContainerStarted","Data":"9015817332cabbe6416bf7f9e935ae56300c0a82caed55909f275c1043642295"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.373818 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5" event={"ID":"a5c6c947-8392-4385-9448-ca70c91635e6","Type":"ContainerStarted","Data":"711259e7e7932b9fced7c6858ce5a8c134f4193cb81e6400253cdbb34ed8b0ed"} Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.932205 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.932260 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:27 crc kubenswrapper[4727]: I1001 12:50:27.973331 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:28 crc kubenswrapper[4727]: E1001 12:50:28.384563 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" podUID="99ea0596-d1a9-434c-a176-0b4a244ecc83" Oct 01 12:50:28 crc kubenswrapper[4727]: E1001 12:50:28.384857 4727 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" podUID="7c69585d-d708-4863-9cdf-bace662d6658" Oct 01 12:50:28 crc kubenswrapper[4727]: E1001 12:50:28.385040 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.18:5001/openstack-k8s-operators/telemetry-operator:bf75b048435f0292acc396dd25db2ce979264c1e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" podUID="cc1db3cf-e8c2-4209-9d01-bb825fb693d6" Oct 01 12:50:28 crc kubenswrapper[4727]: E1001 12:50:28.390337 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" podUID="4924da7d-07e9-4378-9965-c3e85c3018c8" Oct 01 12:50:28 crc kubenswrapper[4727]: E1001 12:50:28.390499 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" podUID="7f874b80-31cc-4c3a-9506-999fb72deac5" Oct 01 12:50:28 crc kubenswrapper[4727]: E1001 12:50:28.391139 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt" podUID="d823b105-b073-44a4-9a1f-eb067b981295" Oct 01 12:50:28 crc kubenswrapper[4727]: E1001 12:50:28.391254 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" podUID="f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506" Oct 01 12:50:28 crc kubenswrapper[4727]: E1001 12:50:28.391274 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" podUID="cd54773a-d526-46e2-a6bd-703886de898c" Oct 01 12:50:28 crc kubenswrapper[4727]: I1001 12:50:28.447736 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:28 crc kubenswrapper[4727]: E1001 12:50:28.538976 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" podUID="5e40e563-9455-43dd-a3ef-e442010c31a4" Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.396054 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" event={"ID":"d6bc6c09-3c9e-4de0-bf11-239a93867c74","Type":"ContainerStarted","Data":"ad69d3403e0d192c92c71d2d538236d48a89ad681bcbe86ee8a184b8e563e268"} Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.396464 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.418498 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2z7z"] Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.425704 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" podStartSLOduration=5.425688766 podStartE2EDuration="5.425688766s" podCreationTimestamp="2025-10-01 12:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:50:29.424288041 +0000 UTC m=+807.745642878" watchObservedRunningTime="2025-10-01 12:50:29.425688766 +0000 UTC m=+807.747043603" Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.512573 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.676577 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-utilities\") pod \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\" (UID: \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\") " Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.677449 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-utilities" (OuterVolumeSpecName: "utilities") pod "cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" (UID: "cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.677532 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-catalog-content\") pod \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\" (UID: \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\") " Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.688115 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htnbp\" (UniqueName: \"kubernetes.io/projected/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-kube-api-access-htnbp\") pod \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\" (UID: \"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03\") " Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.690618 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.696132 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-kube-api-access-htnbp" (OuterVolumeSpecName: "kube-api-access-htnbp") pod "cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" (UID: "cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03"). InnerVolumeSpecName "kube-api-access-htnbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.792316 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htnbp\" (UniqueName: \"kubernetes.io/projected/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-kube-api-access-htnbp\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.799458 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" (UID: "cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:50:29 crc kubenswrapper[4727]: I1001 12:50:29.893889 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:30 crc kubenswrapper[4727]: I1001 12:50:30.414685 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-847n2" event={"ID":"cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03","Type":"ContainerDied","Data":"ca7b29864b038fa133147bff92538b68f0235a6144149a4bac2b2723a22377c6"} Oct 01 12:50:30 crc kubenswrapper[4727]: I1001 12:50:30.414742 4727 scope.go:117] "RemoveContainer" containerID="596c4d552438d4a610e0b1605bc342b25b0144660a8633e6d4c2c679a19677dd" Oct 01 12:50:30 crc kubenswrapper[4727]: I1001 12:50:30.414780 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-847n2" Oct 01 12:50:30 crc kubenswrapper[4727]: I1001 12:50:30.415448 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s2z7z" podUID="6b8f22d2-30de-475f-8277-551a27dc6ce7" containerName="registry-server" containerID="cri-o://5ba009bf984cb867ab10b7b8db87fb9aeb13766a56b156fea719d1dd2c08fe87" gracePeriod=2 Oct 01 12:50:30 crc kubenswrapper[4727]: I1001 12:50:30.434397 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-847n2"] Oct 01 12:50:30 crc kubenswrapper[4727]: I1001 12:50:30.441514 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-847n2"] Oct 01 12:50:31 crc kubenswrapper[4727]: I1001 12:50:31.429058 4727 generic.go:334] "Generic (PLEG): container finished" podID="6b8f22d2-30de-475f-8277-551a27dc6ce7" containerID="5ba009bf984cb867ab10b7b8db87fb9aeb13766a56b156fea719d1dd2c08fe87" exitCode=0 Oct 01 12:50:31 crc kubenswrapper[4727]: I1001 12:50:31.429135 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2z7z" event={"ID":"6b8f22d2-30de-475f-8277-551a27dc6ce7","Type":"ContainerDied","Data":"5ba009bf984cb867ab10b7b8db87fb9aeb13766a56b156fea719d1dd2c08fe87"} Oct 01 12:50:32 crc kubenswrapper[4727]: I1001 12:50:32.395301 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" path="/var/lib/kubelet/pods/cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03/volumes" Oct 01 12:50:35 crc kubenswrapper[4727]: I1001 12:50:35.772376 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5db568f97f-zfnfz" Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.409851 4727 scope.go:117] "RemoveContainer" containerID="3379e90c7535b0165e8a001f03af8ea9c74f55cd6593f6f1992c92426aa4732e" Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.479799 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2z7z" event={"ID":"6b8f22d2-30de-475f-8277-551a27dc6ce7","Type":"ContainerDied","Data":"241bb4125f567c1c8729f4219575a7927f3a89f7b53e9272eeb022fd7b503778"} Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.479849 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="241bb4125f567c1c8729f4219575a7927f3a89f7b53e9272eeb022fd7b503778" Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.482305 4727 scope.go:117] "RemoveContainer" containerID="f58fa227f43ebf454c35a495d9cc6cb0adddc41fcb35a41aa87d1dd531c9d09a" Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.504397 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.599447 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b8f22d2-30de-475f-8277-551a27dc6ce7-catalog-content\") pod \"6b8f22d2-30de-475f-8277-551a27dc6ce7\" (UID: \"6b8f22d2-30de-475f-8277-551a27dc6ce7\") " Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.599774 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b8f22d2-30de-475f-8277-551a27dc6ce7-utilities\") pod \"6b8f22d2-30de-475f-8277-551a27dc6ce7\" (UID: \"6b8f22d2-30de-475f-8277-551a27dc6ce7\") " Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.599817 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcx5h\" (UniqueName: \"kubernetes.io/projected/6b8f22d2-30de-475f-8277-551a27dc6ce7-kube-api-access-tcx5h\") pod \"6b8f22d2-30de-475f-8277-551a27dc6ce7\" (UID: \"6b8f22d2-30de-475f-8277-551a27dc6ce7\") " Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.601814 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b8f22d2-30de-475f-8277-551a27dc6ce7-utilities" (OuterVolumeSpecName: "utilities") pod "6b8f22d2-30de-475f-8277-551a27dc6ce7" (UID: "6b8f22d2-30de-475f-8277-551a27dc6ce7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.606084 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8f22d2-30de-475f-8277-551a27dc6ce7-kube-api-access-tcx5h" (OuterVolumeSpecName: "kube-api-access-tcx5h") pod "6b8f22d2-30de-475f-8277-551a27dc6ce7" (UID: "6b8f22d2-30de-475f-8277-551a27dc6ce7"). InnerVolumeSpecName "kube-api-access-tcx5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.615606 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b8f22d2-30de-475f-8277-551a27dc6ce7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b8f22d2-30de-475f-8277-551a27dc6ce7" (UID: "6b8f22d2-30de-475f-8277-551a27dc6ce7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.701890 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b8f22d2-30de-475f-8277-551a27dc6ce7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.701944 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b8f22d2-30de-475f-8277-551a27dc6ce7-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:36 crc kubenswrapper[4727]: I1001 12:50:36.701959 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcx5h\" (UniqueName: \"kubernetes.io/projected/6b8f22d2-30de-475f-8277-551a27dc6ce7-kube-api-access-tcx5h\") on node \"crc\" DevicePath \"\"" Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.497202 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt" event={"ID":"dc459bd0-7d95-4fe6-981a-7afdb763efa8","Type":"ContainerStarted","Data":"bbce9594453b07351c3a70f751fa75d327db13b3315adc0d76d0c156aacf4929"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.497586 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt" event={"ID":"dc459bd0-7d95-4fe6-981a-7afdb763efa8","Type":"ContainerStarted","Data":"5d8188c898497d324635d95d3021fd4e3d2591686aa5888e766bdf589bfb5f24"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.497608 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt" Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.501216 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk" event={"ID":"2325c2e9-2f53-48b4-8dfb-bc1089a0caab","Type":"ContainerStarted","Data":"67aad2631734dc99df8f5343587c0ff4aa459d5bb20616d12dcd3d993c7ce7bc"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.504234 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6" event={"ID":"43e69ea0-ecf5-40a9-ae20-94ac949ebfeb","Type":"ContainerStarted","Data":"e9c477903e2a60aede9822ee9310e5897595f5745b9ba85d76b39edc9f79f863"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.517616 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4" event={"ID":"583d4e80-fb09-4853-8d80-9df371bf58e6","Type":"ContainerStarted","Data":"30c2d9a427802554ecd3853bcc324fc74d6ef893ed46ca49e3b9b939040faaca"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.529794 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz" event={"ID":"8ed41f7a-f315-407e-b7a8-c5dc3fef764a","Type":"ContainerStarted","Data":"28292cdd5919667a88eb8330a91989e033f79d06e409a646dc0430a3d5a6eefd"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.541255 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5" event={"ID":"47e3cb37-ce4b-4280-9863-ad6a95b1347c","Type":"ContainerStarted","Data":"4c82fc4d991a1c3b338ac41ff4f8a31397888bed45d9fded42953267162cc5f6"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 
12:50:37.566654 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm" event={"ID":"71ad1cc3-a660-4a74-b15d-b1c7e03bf785","Type":"ContainerStarted","Data":"422ae56c53c37b33d9afded6e67f50a320d87c36ee9f422c48bc666655062abb"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.566708 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm" event={"ID":"71ad1cc3-a660-4a74-b15d-b1c7e03bf785","Type":"ContainerStarted","Data":"24687542078d542eb136f9b0dab7b7c0f4763ffe9fcb986d4185e3eb57898bea"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.566765 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm" Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.582525 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5" event={"ID":"4409e813-a7ba-440c-8ef3-22ecac8a1093","Type":"ContainerStarted","Data":"2d82daafadfd12993f3452582cf3ca0f2e4817c7c62651cf4d7c2c1dbd8dc79f"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.590176 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm" podStartSLOduration=4.542995121 podStartE2EDuration="14.590158515s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.435110408 +0000 UTC m=+804.756465245" lastFinishedPulling="2025-10-01 12:50:36.482273802 +0000 UTC m=+814.803628639" observedRunningTime="2025-10-01 12:50:37.58842795 +0000 UTC m=+815.909782817" watchObservedRunningTime="2025-10-01 12:50:37.590158515 +0000 UTC m=+815.911513362" Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.591220 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt" podStartSLOduration=3.555862361 podStartE2EDuration="14.591209858s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:25.387332874 +0000 UTC m=+803.708687711" lastFinishedPulling="2025-10-01 12:50:36.422680371 +0000 UTC m=+814.744035208" observedRunningTime="2025-10-01 12:50:37.525380929 +0000 UTC m=+815.846735776" watchObservedRunningTime="2025-10-01 12:50:37.591209858 +0000 UTC m=+815.912564715" Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.591616 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k" event={"ID":"b1322ef4-b813-41a1-a851-d9e96e4cf7ef","Type":"ContainerStarted","Data":"633a50a70be74e9d0ff091595c3e7ed51f9f8c9a713f4d48b05689909e6e98bb"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.601602 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m" event={"ID":"21ca64fa-6683-4cac-97cd-32944d87bced","Type":"ContainerStarted","Data":"b4fbb856693e81147c051fd6f572f05bd96f83dcae8228301487d13ccce841fd"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.607485 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5" 
event={"ID":"a5c6c947-8392-4385-9448-ca70c91635e6","Type":"ContainerStarted","Data":"8143462fc23dcd9d02e6500c896e7471fcfa2ce2aa341d7e03af6c9aa37a0a0f"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.643704 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7" event={"ID":"18ea0de4-19a4-4417-a13e-bec65f0cfc31","Type":"ContainerStarted","Data":"6c47702fe7577f17f63fa4bcca1f43326602bac96b3b955c66737eb7b26eed07"} Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.651399 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2z7z" Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.775747 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2z7z"] Oct 01 12:50:37 crc kubenswrapper[4727]: I1001 12:50:37.780087 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2z7z"] Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.381976 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8f22d2-30de-475f-8277-551a27dc6ce7" path="/var/lib/kubelet/pods/6b8f22d2-30de-475f-8277-551a27dc6ce7/volumes" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.660406 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6" event={"ID":"43e69ea0-ecf5-40a9-ae20-94ac949ebfeb","Type":"ContainerStarted","Data":"0b832d17f6e2c8c48346ccbca320dc9372cd520be0ee7830c995c1004e58cf10"} Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.660523 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.663135 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k" event={"ID":"b1322ef4-b813-41a1-a851-d9e96e4cf7ef","Type":"ContainerStarted","Data":"910c749d22cc94ab312bafd3abf4be0ee5b0532a7777f73ce0fe703261850e0f"} Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.663196 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.667109 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m" event={"ID":"21ca64fa-6683-4cac-97cd-32944d87bced","Type":"ContainerStarted","Data":"7c03d600ce0c80e712f4861da04bda4b5ad6467f9902ad3a81e7f8ad59b58cd1"} Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.667226 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.669591 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz" event={"ID":"8ed41f7a-f315-407e-b7a8-c5dc3fef764a","Type":"ContainerStarted","Data":"119a285050fd8f5c90b56cb3aa4449b8828fc128681ee16c23557053aa505b50"} Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.669675 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz" Oct 01 12:50:38 crc 
kubenswrapper[4727]: I1001 12:50:38.672100 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5" event={"ID":"47e3cb37-ce4b-4280-9863-ad6a95b1347c","Type":"ContainerStarted","Data":"7571539987e8b795f8ad639061f3b2cc58e5765361d4c6be167ef8066b7a2d0b"} Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.672207 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.674483 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5" event={"ID":"a5c6c947-8392-4385-9448-ca70c91635e6","Type":"ContainerStarted","Data":"e96e682e5d9b6cb8ebf08c5a419d61ef066c2d21c4fea6ad4815ef3c016d6afa"} Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.674598 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.677241 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7" event={"ID":"18ea0de4-19a4-4417-a13e-bec65f0cfc31","Type":"ContainerStarted","Data":"1838c3fb6612ca02ca40adf5e1a07f7dcd3b4366dc4bdb917af21f45551476a9"} Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.678169 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.680927 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst" event={"ID":"52bc77fe-21ba-4ac8-9fca-531e3c80432a","Type":"ContainerStarted","Data":"f97652d2922888bdc98c7bc6d7269374bf14a699185fb7c9bb77c76e20f1d3b4"} Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.680975 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst" event={"ID":"52bc77fe-21ba-4ac8-9fca-531e3c80432a","Type":"ContainerStarted","Data":"29d9d445e561d76b60ba1843d867f54f36f9fcd32dad52ad9fd1ab89f60f0b6e"} Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.681042 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.684181 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5" event={"ID":"4409e813-a7ba-440c-8ef3-22ecac8a1093","Type":"ContainerStarted","Data":"197d98b85e91a0da389c3b29891f2a635b4ba6b5e8960b54247f4caea4de622d"} Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.684276 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.684566 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6" podStartSLOduration=4.709878466 podStartE2EDuration="14.684545059s" podCreationTimestamp="2025-10-01 12:50:24 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.448012908 +0000 UTC m=+804.769367745" 
lastFinishedPulling="2025-10-01 12:50:36.422679501 +0000 UTC m=+814.744034338" observedRunningTime="2025-10-01 12:50:38.682166243 +0000 UTC m=+817.003521110" watchObservedRunningTime="2025-10-01 12:50:38.684545059 +0000 UTC m=+817.005899916" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.687250 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk" event={"ID":"2325c2e9-2f53-48b4-8dfb-bc1089a0caab","Type":"ContainerStarted","Data":"d333060b57a1566f43466d2778e0a76de1d5e98dea9efb8d0601f754de893cb0"} Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.687404 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.690179 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4" event={"ID":"583d4e80-fb09-4853-8d80-9df371bf58e6","Type":"ContainerStarted","Data":"ee8ffd799d9c2d0934722d29924b451dc1b924936f57bd87f4cc79ad9553d401"} Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.708710 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m" podStartSLOduration=5.591891133 podStartE2EDuration="15.708683225s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.355140821 +0000 UTC m=+804.676495658" lastFinishedPulling="2025-10-01 12:50:36.471932903 +0000 UTC m=+814.793287750" observedRunningTime="2025-10-01 12:50:38.701361003 +0000 UTC m=+817.022715860" watchObservedRunningTime="2025-10-01 12:50:38.708683225 +0000 UTC m=+817.030038072" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.720970 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k" podStartSLOduration=4.699017142 podStartE2EDuration="14.720951055s" podCreationTimestamp="2025-10-01 12:50:24 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.449724422 +0000 UTC m=+804.771079259" lastFinishedPulling="2025-10-01 12:50:36.471658335 +0000 UTC m=+814.793013172" observedRunningTime="2025-10-01 12:50:38.720318144 +0000 UTC m=+817.041672991" watchObservedRunningTime="2025-10-01 12:50:38.720951055 +0000 UTC m=+817.042305892" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.742299 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5" podStartSLOduration=5.6115658360000005 podStartE2EDuration="15.742277641s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.352333452 +0000 UTC m=+804.673688289" lastFinishedPulling="2025-10-01 12:50:36.483045257 +0000 UTC m=+814.804400094" observedRunningTime="2025-10-01 12:50:38.736366974 +0000 UTC m=+817.057721821" watchObservedRunningTime="2025-10-01 12:50:38.742277641 +0000 UTC m=+817.063632498" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.764947 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz" podStartSLOduration=5.722416165 podStartE2EDuration="15.76492376s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.454813965 +0000 UTC m=+804.776168802" 
lastFinishedPulling="2025-10-01 12:50:36.49732156 +0000 UTC m=+814.818676397" observedRunningTime="2025-10-01 12:50:38.75988405 +0000 UTC m=+817.081238887" watchObservedRunningTime="2025-10-01 12:50:38.76492376 +0000 UTC m=+817.086278607" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.775570 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5" podStartSLOduration=5.728553339 podStartE2EDuration="15.775548207s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.435427979 +0000 UTC m=+804.756782816" lastFinishedPulling="2025-10-01 12:50:36.482422837 +0000 UTC m=+814.803777684" observedRunningTime="2025-10-01 12:50:38.77310639 +0000 UTC m=+817.094461237" watchObservedRunningTime="2025-10-01 12:50:38.775548207 +0000 UTC m=+817.096903044" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.792355 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst" podStartSLOduration=5.760559466 podStartE2EDuration="15.7923304s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.450965572 +0000 UTC m=+804.772320409" lastFinishedPulling="2025-10-01 12:50:36.482736506 +0000 UTC m=+814.804091343" observedRunningTime="2025-10-01 12:50:38.789956684 +0000 UTC m=+817.111311531" watchObservedRunningTime="2025-10-01 12:50:38.7923304 +0000 UTC m=+817.113685237" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.808479 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7" podStartSLOduration=5.82469102 podStartE2EDuration="15.808458552s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.446015984 +0000 UTC m=+804.767370821" lastFinishedPulling="2025-10-01 12:50:36.429783506 +0000 UTC m=+814.751138353" observedRunningTime="2025-10-01 12:50:38.806243022 +0000 UTC m=+817.127597869" watchObservedRunningTime="2025-10-01 12:50:38.808458552 +0000 UTC m=+817.129813389" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.829168 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4" podStartSLOduration=5.770927864 podStartE2EDuration="15.829145848s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.453643257 +0000 UTC m=+804.774998104" lastFinishedPulling="2025-10-01 12:50:36.511861251 +0000 UTC m=+814.833216088" observedRunningTime="2025-10-01 12:50:38.828175058 +0000 UTC m=+817.149529905" watchObservedRunningTime="2025-10-01 12:50:38.829145848 +0000 UTC m=+817.150500685" Oct 01 12:50:38 crc kubenswrapper[4727]: I1001 12:50:38.846487 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk" podStartSLOduration=5.029752859 podStartE2EDuration="14.846469058s" podCreationTimestamp="2025-10-01 12:50:24 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.681921262 +0000 UTC m=+805.003276099" lastFinishedPulling="2025-10-01 12:50:36.498637461 +0000 UTC m=+814.819992298" observedRunningTime="2025-10-01 12:50:38.844646721 +0000 UTC m=+817.166001558" watchObservedRunningTime="2025-10-01 12:50:38.846469058 +0000 UTC m=+817.167823895" Oct 01 12:50:38 crc kubenswrapper[4727]: 
I1001 12:50:38.864919 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5" podStartSLOduration=5.021729376 podStartE2EDuration="15.864895423s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:25.654362229 +0000 UTC m=+803.975717066" lastFinishedPulling="2025-10-01 12:50:36.497528276 +0000 UTC m=+814.818883113" observedRunningTime="2025-10-01 12:50:38.859662507 +0000 UTC m=+817.181017384" watchObservedRunningTime="2025-10-01 12:50:38.864895423 +0000 UTC m=+817.186250260" Oct 01 12:50:39 crc kubenswrapper[4727]: I1001 12:50:39.697849 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4" Oct 01 12:50:41 crc kubenswrapper[4727]: I1001 12:50:41.717467 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" event={"ID":"4924da7d-07e9-4378-9965-c3e85c3018c8","Type":"ContainerStarted","Data":"dc51576f198d959c3368dd12382f2f46a36990517e9c248e8285837fc60e665e"} Oct 01 12:50:41 crc kubenswrapper[4727]: I1001 12:50:41.718041 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" Oct 01 12:50:41 crc kubenswrapper[4727]: I1001 12:50:41.722074 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" event={"ID":"7c69585d-d708-4863-9cdf-bace662d6658","Type":"ContainerStarted","Data":"7247da7c641da23411817db7c8d8394294533531ba2811022483e4934cc1994b"} Oct 01 12:50:41 crc kubenswrapper[4727]: I1001 12:50:41.722518 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" Oct 01 12:50:41 crc kubenswrapper[4727]: I1001 12:50:41.756194 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" podStartSLOduration=3.3250978350000002 podStartE2EDuration="17.756168078s" podCreationTimestamp="2025-10-01 12:50:24 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.716742807 +0000 UTC m=+805.038097644" lastFinishedPulling="2025-10-01 12:50:41.14781305 +0000 UTC m=+819.469167887" observedRunningTime="2025-10-01 12:50:41.743782326 +0000 UTC m=+820.065137173" watchObservedRunningTime="2025-10-01 12:50:41.756168078 +0000 UTC m=+820.077522935" Oct 01 12:50:41 crc kubenswrapper[4727]: I1001 12:50:41.769973 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" podStartSLOduration=4.09493571 podStartE2EDuration="18.769955626s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.464606285 +0000 UTC m=+804.785961132" lastFinishedPulling="2025-10-01 12:50:41.139626201 +0000 UTC m=+819.460981048" observedRunningTime="2025-10-01 12:50:41.760090023 +0000 UTC m=+820.081444880" watchObservedRunningTime="2025-10-01 12:50:41.769955626 +0000 UTC m=+820.091310463" Oct 01 12:50:44 crc kubenswrapper[4727]: I1001 12:50:44.165788 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-dvddt" Oct 01 12:50:44 crc kubenswrapper[4727]: 
I1001 12:50:44.200469 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-6m8j5" Oct 01 12:50:44 crc kubenswrapper[4727]: I1001 12:50:44.255086 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-6zhw5" Oct 01 12:50:44 crc kubenswrapper[4727]: I1001 12:50:44.264577 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-k7cf5" Oct 01 12:50:44 crc kubenswrapper[4727]: I1001 12:50:44.365213 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-p4w7m" Oct 01 12:50:44 crc kubenswrapper[4727]: I1001 12:50:44.383306 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-j9gmz" Oct 01 12:50:44 crc kubenswrapper[4727]: I1001 12:50:44.418514 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-2mrqm" Oct 01 12:50:44 crc kubenswrapper[4727]: I1001 12:50:44.425150 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-82kh4" Oct 01 12:50:44 crc kubenswrapper[4727]: I1001 12:50:44.523251 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-vgqst" Oct 01 12:50:44 crc kubenswrapper[4727]: I1001 12:50:44.671967 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-dt9z7" Oct 01 12:50:44 crc kubenswrapper[4727]: I1001 12:50:44.732452 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-wbl5k" Oct 01 12:50:44 crc kubenswrapper[4727]: I1001 12:50:44.785642 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-czvw6" Oct 01 12:50:45 crc kubenswrapper[4727]: I1001 12:50:45.246253 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-kcsrk" Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.768340 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" event={"ID":"f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506","Type":"ContainerStarted","Data":"2995d2865f645ddb0d0cb60611dd4bc04c58a088aabe8991112544eb856d5898"} Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.769240 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.770457 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt" event={"ID":"d823b105-b073-44a4-9a1f-eb067b981295","Type":"ContainerStarted","Data":"48d1cfaedc43dae3fbbbb862d3ea72f474de4efc1b2995cd0835362aed388995"} Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.772912 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" event={"ID":"99ea0596-d1a9-434c-a176-0b4a244ecc83","Type":"ContainerStarted","Data":"9f5f85f7925c34e191d5ce8b578f41d32fc442602e1bf062560cb5fe2d41830b"} Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.773156 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.774957 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" event={"ID":"7f874b80-31cc-4c3a-9506-999fb72deac5","Type":"ContainerStarted","Data":"d41ee1e8d242bb6f0937aea7c3763129c93e176adf9a27633913d3e51b09d61c"} Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.775146 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.778780 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" event={"ID":"cd54773a-d526-46e2-a6bd-703886de898c","Type":"ContainerStarted","Data":"af8e80c709718afd10a1dcb84d05f58cced10f80288f7a7ef56ef3df7a543691"} Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.779144 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.780928 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" event={"ID":"5e40e563-9455-43dd-a3ef-e442010c31a4","Type":"ContainerStarted","Data":"c4513bf639571683d43772046bd6116c912f897570a9d29d5f64b512c6620443"} Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.782847 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" event={"ID":"cc1db3cf-e8c2-4209-9d01-bb825fb693d6","Type":"ContainerStarted","Data":"202ded72da10287ba2338dabe264fd28c0fc9d59684e9658ea1307fd5a10f098"} Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.783047 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.795156 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" podStartSLOduration=4.032287774 podStartE2EDuration="24.795133047s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.463867462 +0000 UTC m=+804.785222299" lastFinishedPulling="2025-10-01 12:50:47.226712705 +0000 UTC m=+825.548067572" observedRunningTime="2025-10-01 12:50:47.788326441 +0000 UTC m=+826.109681288" watchObservedRunningTime="2025-10-01 12:50:47.795133047 +0000 UTC m=+826.116487884" Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.812532 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" podStartSLOduration=3.825587551 podStartE2EDuration="23.812512629s" podCreationTimestamp="2025-10-01 12:50:24 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.701801073 +0000 UTC m=+805.023155910" 
lastFinishedPulling="2025-10-01 12:50:46.688726151 +0000 UTC m=+825.010080988" observedRunningTime="2025-10-01 12:50:47.808029166 +0000 UTC m=+826.129384023" watchObservedRunningTime="2025-10-01 12:50:47.812512629 +0000 UTC m=+826.133867466" Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.836230 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" podStartSLOduration=3.311822015 podStartE2EDuration="23.836208321s" podCreationTimestamp="2025-10-01 12:50:24 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.714235008 +0000 UTC m=+805.035589845" lastFinishedPulling="2025-10-01 12:50:47.238621314 +0000 UTC m=+825.559976151" observedRunningTime="2025-10-01 12:50:47.829660153 +0000 UTC m=+826.151015020" watchObservedRunningTime="2025-10-01 12:50:47.836208321 +0000 UTC m=+826.157563168" Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.851117 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" podStartSLOduration=3.296687383 podStartE2EDuration="23.851097323s" podCreationTimestamp="2025-10-01 12:50:24 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.71557734 +0000 UTC m=+805.036932177" lastFinishedPulling="2025-10-01 12:50:47.26998728 +0000 UTC m=+825.591342117" observedRunningTime="2025-10-01 12:50:47.847226911 +0000 UTC m=+826.168581748" watchObservedRunningTime="2025-10-01 12:50:47.851097323 +0000 UTC m=+826.172452160" Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.868301 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt" podStartSLOduration=3.348302651 podStartE2EDuration="23.868283078s" podCreationTimestamp="2025-10-01 12:50:24 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.714765714 +0000 UTC m=+805.036120551" lastFinishedPulling="2025-10-01 12:50:47.234746141 +0000 UTC m=+825.556100978" observedRunningTime="2025-10-01 12:50:47.86737163 +0000 UTC m=+826.188726467" watchObservedRunningTime="2025-10-01 12:50:47.868283078 +0000 UTC m=+826.189637925" Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.892239 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" podStartSLOduration=4.668918688 podStartE2EDuration="24.892221828s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.465587366 +0000 UTC m=+804.786942203" lastFinishedPulling="2025-10-01 12:50:46.688890506 +0000 UTC m=+825.010245343" observedRunningTime="2025-10-01 12:50:47.887775017 +0000 UTC m=+826.209129844" watchObservedRunningTime="2025-10-01 12:50:47.892221828 +0000 UTC m=+826.213576665" Oct 01 12:50:47 crc kubenswrapper[4727]: I1001 12:50:47.909671 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" podStartSLOduration=4.686390584 podStartE2EDuration="24.909651812s" podCreationTimestamp="2025-10-01 12:50:23 +0000 UTC" firstStartedPulling="2025-10-01 12:50:26.465575766 +0000 UTC m=+804.786930603" lastFinishedPulling="2025-10-01 12:50:46.688836984 +0000 UTC m=+825.010191831" observedRunningTime="2025-10-01 12:50:47.906322976 +0000 UTC m=+826.227677813" watchObservedRunningTime="2025-10-01 12:50:47.909651812 +0000 UTC m=+826.231006639" Oct 01 12:50:49 crc 
kubenswrapper[4727]: I1001 12:50:49.130288 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zf6bl"] Oct 01 12:50:49 crc kubenswrapper[4727]: E1001 12:50:49.130609 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" containerName="extract-content" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.130623 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" containerName="extract-content" Oct 01 12:50:49 crc kubenswrapper[4727]: E1001 12:50:49.130648 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" containerName="registry-server" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.130654 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" containerName="registry-server" Oct 01 12:50:49 crc kubenswrapper[4727]: E1001 12:50:49.130676 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8f22d2-30de-475f-8277-551a27dc6ce7" containerName="registry-server" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.130682 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8f22d2-30de-475f-8277-551a27dc6ce7" containerName="registry-server" Oct 01 12:50:49 crc kubenswrapper[4727]: E1001 12:50:49.130700 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" containerName="extract-utilities" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.130706 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" containerName="extract-utilities" Oct 01 12:50:49 crc kubenswrapper[4727]: E1001 12:50:49.130719 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8f22d2-30de-475f-8277-551a27dc6ce7" containerName="extract-utilities" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.130725 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8f22d2-30de-475f-8277-551a27dc6ce7" containerName="extract-utilities" Oct 01 12:50:49 crc kubenswrapper[4727]: E1001 12:50:49.130736 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8f22d2-30de-475f-8277-551a27dc6ce7" containerName="extract-content" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.130742 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8f22d2-30de-475f-8277-551a27dc6ce7" containerName="extract-content" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.130869 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf82b2e3-4f8f-4446-b4d5-d0e6b0f8ec03" containerName="registry-server" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.130886 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8f22d2-30de-475f-8277-551a27dc6ce7" containerName="registry-server" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.132760 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.177960 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zf6bl"] Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.286755 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10fa6698-b78c-490e-8253-858d7eafce23-catalog-content\") pod \"community-operators-zf6bl\" (UID: \"10fa6698-b78c-490e-8253-858d7eafce23\") " pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.287525 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd6c5\" (UniqueName: \"kubernetes.io/projected/10fa6698-b78c-490e-8253-858d7eafce23-kube-api-access-dd6c5\") pod \"community-operators-zf6bl\" (UID: \"10fa6698-b78c-490e-8253-858d7eafce23\") " pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.287649 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10fa6698-b78c-490e-8253-858d7eafce23-utilities\") pod \"community-operators-zf6bl\" (UID: \"10fa6698-b78c-490e-8253-858d7eafce23\") " pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.389352 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10fa6698-b78c-490e-8253-858d7eafce23-catalog-content\") pod \"community-operators-zf6bl\" (UID: \"10fa6698-b78c-490e-8253-858d7eafce23\") " pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.389438 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd6c5\" (UniqueName: \"kubernetes.io/projected/10fa6698-b78c-490e-8253-858d7eafce23-kube-api-access-dd6c5\") pod \"community-operators-zf6bl\" (UID: \"10fa6698-b78c-490e-8253-858d7eafce23\") " pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.389459 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10fa6698-b78c-490e-8253-858d7eafce23-utilities\") pod \"community-operators-zf6bl\" (UID: \"10fa6698-b78c-490e-8253-858d7eafce23\") " pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.391623 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10fa6698-b78c-490e-8253-858d7eafce23-catalog-content\") pod \"community-operators-zf6bl\" (UID: \"10fa6698-b78c-490e-8253-858d7eafce23\") " pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.392459 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10fa6698-b78c-490e-8253-858d7eafce23-utilities\") pod \"community-operators-zf6bl\" (UID: \"10fa6698-b78c-490e-8253-858d7eafce23\") " pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.429690 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dd6c5\" (UniqueName: \"kubernetes.io/projected/10fa6698-b78c-490e-8253-858d7eafce23-kube-api-access-dd6c5\") pod \"community-operators-zf6bl\" (UID: \"10fa6698-b78c-490e-8253-858d7eafce23\") " pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.482131 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:50:49 crc kubenswrapper[4727]: I1001 12:50:49.959100 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zf6bl"] Oct 01 12:50:49 crc kubenswrapper[4727]: W1001 12:50:49.965048 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10fa6698_b78c_490e_8253_858d7eafce23.slice/crio-9e8b74539fe12a0fa39f334f01455dc3818d92dbfdbf4474e31b52b4b1aeb4d9 WatchSource:0}: Error finding container 9e8b74539fe12a0fa39f334f01455dc3818d92dbfdbf4474e31b52b4b1aeb4d9: Status 404 returned error can't find the container with id 9e8b74539fe12a0fa39f334f01455dc3818d92dbfdbf4474e31b52b4b1aeb4d9 Oct 01 12:50:50 crc kubenswrapper[4727]: I1001 12:50:50.807905 4727 generic.go:334] "Generic (PLEG): container finished" podID="10fa6698-b78c-490e-8253-858d7eafce23" containerID="85e7b2906e724c9e2f58c875b3bb8287c9a94765b7149f68714859e375c18b84" exitCode=0 Oct 01 12:50:50 crc kubenswrapper[4727]: I1001 12:50:50.807948 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zf6bl" event={"ID":"10fa6698-b78c-490e-8253-858d7eafce23","Type":"ContainerDied","Data":"85e7b2906e724c9e2f58c875b3bb8287c9a94765b7149f68714859e375c18b84"} Oct 01 12:50:50 crc kubenswrapper[4727]: I1001 12:50:50.809210 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zf6bl" event={"ID":"10fa6698-b78c-490e-8253-858d7eafce23","Type":"ContainerStarted","Data":"9e8b74539fe12a0fa39f334f01455dc3818d92dbfdbf4474e31b52b4b1aeb4d9"} Oct 01 12:50:52 crc kubenswrapper[4727]: I1001 12:50:52.826496 4727 generic.go:334] "Generic (PLEG): container finished" podID="10fa6698-b78c-490e-8253-858d7eafce23" containerID="cb9a7a8bd67341fd9812ed9335b09c78e7a4ec6ae897c10bb30e9827026111be" exitCode=0 Oct 01 12:50:52 crc kubenswrapper[4727]: I1001 12:50:52.826580 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zf6bl" event={"ID":"10fa6698-b78c-490e-8253-858d7eafce23","Type":"ContainerDied","Data":"cb9a7a8bd67341fd9812ed9335b09c78e7a4ec6ae897c10bb30e9827026111be"} Oct 01 12:50:52 crc kubenswrapper[4727]: E1001 12:50:52.840128 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10fa6698_b78c_490e_8253_858d7eafce23.slice/crio-conmon-cb9a7a8bd67341fd9812ed9335b09c78e7a4ec6ae897c10bb30e9827026111be.scope\": RecentStats: unable to find data in memory cache]" Oct 01 12:50:54 crc kubenswrapper[4727]: I1001 12:50:54.442131 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-7vqrx" Oct 01 12:50:54 crc kubenswrapper[4727]: I1001 12:50:54.537787 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" Oct 01 12:50:54 crc 
kubenswrapper[4727]: I1001 12:50:54.540502 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-lkdzs" Oct 01 12:50:54 crc kubenswrapper[4727]: I1001 12:50:54.597514 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-x8pr2" Oct 01 12:50:54 crc kubenswrapper[4727]: I1001 12:50:54.832253 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-6l8fp" Oct 01 12:50:54 crc kubenswrapper[4727]: I1001 12:50:54.835320 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-c88bk" Oct 01 12:50:54 crc kubenswrapper[4727]: I1001 12:50:54.853552 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b5b89c9dd-6c9pp" Oct 01 12:50:54 crc kubenswrapper[4727]: I1001 12:50:54.977946 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-85777745bb-smzzs" Oct 01 12:50:55 crc kubenswrapper[4727]: I1001 12:50:55.265820 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c9z78p" Oct 01 12:51:06 crc kubenswrapper[4727]: I1001 12:51:06.950220 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zf6bl" event={"ID":"10fa6698-b78c-490e-8253-858d7eafce23","Type":"ContainerStarted","Data":"5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d"} Oct 01 12:51:07 crc kubenswrapper[4727]: I1001 12:51:07.988774 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zf6bl" podStartSLOduration=4.336532742 podStartE2EDuration="18.988731922s" podCreationTimestamp="2025-10-01 12:50:49 +0000 UTC" firstStartedPulling="2025-10-01 12:50:50.809453227 +0000 UTC m=+829.130808064" lastFinishedPulling="2025-10-01 12:51:05.461652407 +0000 UTC m=+843.783007244" observedRunningTime="2025-10-01 12:51:07.984414135 +0000 UTC m=+846.305768992" watchObservedRunningTime="2025-10-01 12:51:07.988731922 +0000 UTC m=+846.310086759" Oct 01 12:51:09 crc kubenswrapper[4727]: I1001 12:51:09.482864 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:51:09 crc kubenswrapper[4727]: I1001 12:51:09.482917 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:51:09 crc kubenswrapper[4727]: I1001 12:51:09.528730 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.383916 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r95pp"] Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.385681 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.388593 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.388825 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-sfxx4" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.389123 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.389300 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.414454 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r95pp"] Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.483635 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tm7qq"] Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.484833 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.487567 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.503593 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tm7qq"] Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.554848 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d1ca15-34cd-43d2-9cab-bb8a54b915ea-config\") pod \"dnsmasq-dns-675f4bcbfc-r95pp\" (UID: \"53d1ca15-34cd-43d2-9cab-bb8a54b915ea\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.554936 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ks4b\" (UniqueName: \"kubernetes.io/projected/53d1ca15-34cd-43d2-9cab-bb8a54b915ea-kube-api-access-6ks4b\") pod \"dnsmasq-dns-675f4bcbfc-r95pp\" (UID: \"53d1ca15-34cd-43d2-9cab-bb8a54b915ea\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.657053 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c74af31-9f4e-4082-85b7-c83442b3444f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tm7qq\" (UID: \"6c74af31-9f4e-4082-85b7-c83442b3444f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.658256 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dj99\" (UniqueName: \"kubernetes.io/projected/6c74af31-9f4e-4082-85b7-c83442b3444f-kube-api-access-7dj99\") pod \"dnsmasq-dns-78dd6ddcc-tm7qq\" (UID: \"6c74af31-9f4e-4082-85b7-c83442b3444f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.658376 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d1ca15-34cd-43d2-9cab-bb8a54b915ea-config\") pod \"dnsmasq-dns-675f4bcbfc-r95pp\" (UID: \"53d1ca15-34cd-43d2-9cab-bb8a54b915ea\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" 
Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.659328 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d1ca15-34cd-43d2-9cab-bb8a54b915ea-config\") pod \"dnsmasq-dns-675f4bcbfc-r95pp\" (UID: \"53d1ca15-34cd-43d2-9cab-bb8a54b915ea\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.659460 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c74af31-9f4e-4082-85b7-c83442b3444f-config\") pod \"dnsmasq-dns-78dd6ddcc-tm7qq\" (UID: \"6c74af31-9f4e-4082-85b7-c83442b3444f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.659696 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ks4b\" (UniqueName: \"kubernetes.io/projected/53d1ca15-34cd-43d2-9cab-bb8a54b915ea-kube-api-access-6ks4b\") pod \"dnsmasq-dns-675f4bcbfc-r95pp\" (UID: \"53d1ca15-34cd-43d2-9cab-bb8a54b915ea\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.678851 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ks4b\" (UniqueName: \"kubernetes.io/projected/53d1ca15-34cd-43d2-9cab-bb8a54b915ea-kube-api-access-6ks4b\") pod \"dnsmasq-dns-675f4bcbfc-r95pp\" (UID: \"53d1ca15-34cd-43d2-9cab-bb8a54b915ea\") " pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.711182 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.761269 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dj99\" (UniqueName: \"kubernetes.io/projected/6c74af31-9f4e-4082-85b7-c83442b3444f-kube-api-access-7dj99\") pod \"dnsmasq-dns-78dd6ddcc-tm7qq\" (UID: \"6c74af31-9f4e-4082-85b7-c83442b3444f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.761367 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c74af31-9f4e-4082-85b7-c83442b3444f-config\") pod \"dnsmasq-dns-78dd6ddcc-tm7qq\" (UID: \"6c74af31-9f4e-4082-85b7-c83442b3444f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.761455 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c74af31-9f4e-4082-85b7-c83442b3444f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tm7qq\" (UID: \"6c74af31-9f4e-4082-85b7-c83442b3444f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.762403 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c74af31-9f4e-4082-85b7-c83442b3444f-config\") pod \"dnsmasq-dns-78dd6ddcc-tm7qq\" (UID: \"6c74af31-9f4e-4082-85b7-c83442b3444f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.762537 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c74af31-9f4e-4082-85b7-c83442b3444f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tm7qq\" (UID: 
\"6c74af31-9f4e-4082-85b7-c83442b3444f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.781918 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dj99\" (UniqueName: \"kubernetes.io/projected/6c74af31-9f4e-4082-85b7-c83442b3444f-kube-api-access-7dj99\") pod \"dnsmasq-dns-78dd6ddcc-tm7qq\" (UID: \"6c74af31-9f4e-4082-85b7-c83442b3444f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:13 crc kubenswrapper[4727]: I1001 12:51:13.812624 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:14 crc kubenswrapper[4727]: I1001 12:51:14.185675 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r95pp"] Oct 01 12:51:14 crc kubenswrapper[4727]: W1001 12:51:14.188611 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d1ca15_34cd_43d2_9cab_bb8a54b915ea.slice/crio-8517478e1339ecaa2e25c1c538a5f03b9ed31bb1151d9b43d74fd51b2a2edeb1 WatchSource:0}: Error finding container 8517478e1339ecaa2e25c1c538a5f03b9ed31bb1151d9b43d74fd51b2a2edeb1: Status 404 returned error can't find the container with id 8517478e1339ecaa2e25c1c538a5f03b9ed31bb1151d9b43d74fd51b2a2edeb1 Oct 01 12:51:14 crc kubenswrapper[4727]: I1001 12:51:14.278443 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tm7qq"] Oct 01 12:51:14 crc kubenswrapper[4727]: W1001 12:51:14.287611 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c74af31_9f4e_4082_85b7_c83442b3444f.slice/crio-72002d2216be89ceab8b52dfbfb014608f7fff8e790f711b6221a0b3c248aeff WatchSource:0}: Error finding container 72002d2216be89ceab8b52dfbfb014608f7fff8e790f711b6221a0b3c248aeff: Status 404 returned error can't find the container with id 72002d2216be89ceab8b52dfbfb014608f7fff8e790f711b6221a0b3c248aeff Oct 01 12:51:15 crc kubenswrapper[4727]: I1001 12:51:15.008375 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" event={"ID":"53d1ca15-34cd-43d2-9cab-bb8a54b915ea","Type":"ContainerStarted","Data":"8517478e1339ecaa2e25c1c538a5f03b9ed31bb1151d9b43d74fd51b2a2edeb1"} Oct 01 12:51:15 crc kubenswrapper[4727]: I1001 12:51:15.009686 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" event={"ID":"6c74af31-9f4e-4082-85b7-c83442b3444f","Type":"ContainerStarted","Data":"72002d2216be89ceab8b52dfbfb014608f7fff8e790f711b6221a0b3c248aeff"} Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.067384 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r95pp"] Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.107011 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-k4sr5"] Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.111608 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.123035 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-k4sr5"] Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.214452 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef5d956e-c176-4586-83a3-177604f67404-dns-svc\") pod \"dnsmasq-dns-666b6646f7-k4sr5\" (UID: \"ef5d956e-c176-4586-83a3-177604f67404\") " pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.215020 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztn2c\" (UniqueName: \"kubernetes.io/projected/ef5d956e-c176-4586-83a3-177604f67404-kube-api-access-ztn2c\") pod \"dnsmasq-dns-666b6646f7-k4sr5\" (UID: \"ef5d956e-c176-4586-83a3-177604f67404\") " pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.215135 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef5d956e-c176-4586-83a3-177604f67404-config\") pod \"dnsmasq-dns-666b6646f7-k4sr5\" (UID: \"ef5d956e-c176-4586-83a3-177604f67404\") " pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.316775 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef5d956e-c176-4586-83a3-177604f67404-dns-svc\") pod \"dnsmasq-dns-666b6646f7-k4sr5\" (UID: \"ef5d956e-c176-4586-83a3-177604f67404\") " pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.316885 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztn2c\" (UniqueName: \"kubernetes.io/projected/ef5d956e-c176-4586-83a3-177604f67404-kube-api-access-ztn2c\") pod \"dnsmasq-dns-666b6646f7-k4sr5\" (UID: \"ef5d956e-c176-4586-83a3-177604f67404\") " pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.316919 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef5d956e-c176-4586-83a3-177604f67404-config\") pod \"dnsmasq-dns-666b6646f7-k4sr5\" (UID: \"ef5d956e-c176-4586-83a3-177604f67404\") " pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.318088 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef5d956e-c176-4586-83a3-177604f67404-config\") pod \"dnsmasq-dns-666b6646f7-k4sr5\" (UID: \"ef5d956e-c176-4586-83a3-177604f67404\") " pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.318698 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef5d956e-c176-4586-83a3-177604f67404-dns-svc\") pod \"dnsmasq-dns-666b6646f7-k4sr5\" (UID: \"ef5d956e-c176-4586-83a3-177604f67404\") " pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.354277 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztn2c\" (UniqueName: 
\"kubernetes.io/projected/ef5d956e-c176-4586-83a3-177604f67404-kube-api-access-ztn2c\") pod \"dnsmasq-dns-666b6646f7-k4sr5\" (UID: \"ef5d956e-c176-4586-83a3-177604f67404\") " pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.445072 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.529970 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tm7qq"] Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.588131 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rqj8q"] Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.593640 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.597947 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rqj8q"] Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.647031 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4hrx\" (UniqueName: \"kubernetes.io/projected/dd17053d-9c00-4932-bace-9382d74d3094-kube-api-access-w4hrx\") pod \"dnsmasq-dns-57d769cc4f-rqj8q\" (UID: \"dd17053d-9c00-4932-bace-9382d74d3094\") " pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.647107 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd17053d-9c00-4932-bace-9382d74d3094-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rqj8q\" (UID: \"dd17053d-9c00-4932-bace-9382d74d3094\") " pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.647139 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd17053d-9c00-4932-bace-9382d74d3094-config\") pod \"dnsmasq-dns-57d769cc4f-rqj8q\" (UID: \"dd17053d-9c00-4932-bace-9382d74d3094\") " pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.749045 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4hrx\" (UniqueName: \"kubernetes.io/projected/dd17053d-9c00-4932-bace-9382d74d3094-kube-api-access-w4hrx\") pod \"dnsmasq-dns-57d769cc4f-rqj8q\" (UID: \"dd17053d-9c00-4932-bace-9382d74d3094\") " pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.749174 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd17053d-9c00-4932-bace-9382d74d3094-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rqj8q\" (UID: \"dd17053d-9c00-4932-bace-9382d74d3094\") " pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.749215 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd17053d-9c00-4932-bace-9382d74d3094-config\") pod \"dnsmasq-dns-57d769cc4f-rqj8q\" (UID: \"dd17053d-9c00-4932-bace-9382d74d3094\") " pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.751703 4727 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd17053d-9c00-4932-bace-9382d74d3094-config\") pod \"dnsmasq-dns-57d769cc4f-rqj8q\" (UID: \"dd17053d-9c00-4932-bace-9382d74d3094\") " pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.752994 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd17053d-9c00-4932-bace-9382d74d3094-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rqj8q\" (UID: \"dd17053d-9c00-4932-bace-9382d74d3094\") " pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.775277 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4hrx\" (UniqueName: \"kubernetes.io/projected/dd17053d-9c00-4932-bace-9382d74d3094-kube-api-access-w4hrx\") pod \"dnsmasq-dns-57d769cc4f-rqj8q\" (UID: \"dd17053d-9c00-4932-bace-9382d74d3094\") " pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:16 crc kubenswrapper[4727]: I1001 12:51:16.928725 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.033387 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-k4sr5"] Oct 01 12:51:17 crc kubenswrapper[4727]: W1001 12:51:17.071909 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef5d956e_c176_4586_83a3_177604f67404.slice/crio-0df384f47380ded3e30f9108c507dccdd07289e669bad1100ebeb8ee65ccd520 WatchSource:0}: Error finding container 0df384f47380ded3e30f9108c507dccdd07289e669bad1100ebeb8ee65ccd520: Status 404 returned error can't find the container with id 0df384f47380ded3e30f9108c507dccdd07289e669bad1100ebeb8ee65ccd520 Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.283395 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.284788 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.287296 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.287587 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.288815 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.290044 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.290089 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-765c7" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.290119 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.290179 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.304595 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.367737 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phpnd\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-kube-api-access-phpnd\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.367951 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.368064 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-config-data\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.368093 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.368143 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.368188 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/74ad068e-3c83-4fd2-af0a-7e45cd945411-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.368218 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.368377 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.368526 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.368626 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74ad068e-3c83-4fd2-af0a-7e45cd945411-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.368714 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.470170 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.470224 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74ad068e-3c83-4fd2-af0a-7e45cd945411-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.470258 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.470286 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phpnd\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-kube-api-access-phpnd\") pod \"rabbitmq-server-0\" (UID: 
\"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.470320 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.470446 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-config-data\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.470478 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.470495 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.470509 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74ad068e-3c83-4fd2-af0a-7e45cd945411-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.470529 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.470546 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.471740 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.471801 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.472359 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-config-data\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.472785 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.474461 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.475019 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-server-conf\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.478974 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74ad068e-3c83-4fd2-af0a-7e45cd945411-pod-info\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.480758 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.486211 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rqj8q"] Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.489235 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.499601 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phpnd\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-kube-api-access-phpnd\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.501294 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.504638 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74ad068e-3c83-4fd2-af0a-7e45cd945411-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") 
" pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.612851 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.690123 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.692598 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.699767 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f7tcr" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.700164 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.702600 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.702842 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.703304 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.703564 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.703653 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.710547 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.876845 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.876932 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxjwx\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-kube-api-access-jxjwx\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.877096 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.877141 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.877214 
4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.877311 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.877343 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.877392 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.877455 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.877483 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.877517 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.979338 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.979412 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.979444 
4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.979486 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.979542 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.979569 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.979591 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.979621 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.979656 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxjwx\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-kube-api-access-jxjwx\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.979687 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.979714 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.981480 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.981875 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.981912 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.982192 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.982343 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.983346 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.987044 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.987819 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.991203 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:17 crc kubenswrapper[4727]: I1001 12:51:17.991945 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" 
Oct 01 12:51:18 crc kubenswrapper[4727]: I1001 12:51:18.004377 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxjwx\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-kube-api-access-jxjwx\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:18 crc kubenswrapper[4727]: I1001 12:51:18.010941 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:18 crc kubenswrapper[4727]: I1001 12:51:18.057468 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" event={"ID":"dd17053d-9c00-4932-bace-9382d74d3094","Type":"ContainerStarted","Data":"7b7cc177f083547256dff22d000eec32dd501dc7bea684efbde1747792f33056"} Oct 01 12:51:18 crc kubenswrapper[4727]: I1001 12:51:18.058983 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" event={"ID":"ef5d956e-c176-4586-83a3-177604f67404","Type":"ContainerStarted","Data":"0df384f47380ded3e30f9108c507dccdd07289e669bad1100ebeb8ee65ccd520"} Oct 01 12:51:18 crc kubenswrapper[4727]: I1001 12:51:18.079558 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:51:18 crc kubenswrapper[4727]: I1001 12:51:18.214959 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:51:18 crc kubenswrapper[4727]: I1001 12:51:18.613954 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.071706 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876","Type":"ContainerStarted","Data":"7507e5db8e1365746122c603727e5ff2587d837e674590d8264e66d621fb1866"} Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.085500 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74ad068e-3c83-4fd2-af0a-7e45cd945411","Type":"ContainerStarted","Data":"88e8b697b422e778535380bfe388b3ac97fcfdf6805ac9932a8be3297366f19b"} Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.563350 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.647038 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zf6bl"] Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.769491 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.771206 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.778050 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.778571 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-xt6kv" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.778756 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.778933 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.782067 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.786144 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.789641 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.917937 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-config-data-default\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.918041 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.918174 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-kolla-config\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.918365 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.918457 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.918526 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:19 crc 
kubenswrapper[4727]: I1001 12:51:19.918789 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.918853 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bffv\" (UniqueName: \"kubernetes.io/projected/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-kube-api-access-6bffv\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:19 crc kubenswrapper[4727]: I1001 12:51:19.919057 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-secrets\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.022505 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.022588 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.022677 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.022702 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bffv\" (UniqueName: \"kubernetes.io/projected/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-kube-api-access-6bffv\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.022730 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-secrets\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.022748 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.022767 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-config-data-default\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.022792 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-kolla-config\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.022834 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.023131 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.024141 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.024261 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-config-data-default\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.024599 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.026619 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-kolla-config\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.036738 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-secrets\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.037150 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 
12:51:20.055217 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bffv\" (UniqueName: \"kubernetes.io/projected/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-kube-api-access-6bffv\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.060012 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d10e37bd-ab54-4798-bfa1-a94f2e13eba0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.074100 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"d10e37bd-ab54-4798-bfa1-a94f2e13eba0\") " pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.111089 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zf6bl" podUID="10fa6698-b78c-490e-8253-858d7eafce23" containerName="registry-server" containerID="cri-o://5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d" gracePeriod=2 Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.115610 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.233922 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.235471 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.241706 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-b677g" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.241744 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.242361 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.242683 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.247716 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.331357 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.331403 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e2d457-092b-4b9d-a5fc-375a59758259-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.331426 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e2d457-092b-4b9d-a5fc-375a59758259-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.331446 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/01e2d457-092b-4b9d-a5fc-375a59758259-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.332031 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e2d457-092b-4b9d-a5fc-375a59758259-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.332067 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01e2d457-092b-4b9d-a5fc-375a59758259-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.332135 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ls5d\" (UniqueName: 
\"kubernetes.io/projected/01e2d457-092b-4b9d-a5fc-375a59758259-kube-api-access-2ls5d\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.332217 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01e2d457-092b-4b9d-a5fc-375a59758259-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.335546 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01e2d457-092b-4b9d-a5fc-375a59758259-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.437855 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01e2d457-092b-4b9d-a5fc-375a59758259-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.437924 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01e2d457-092b-4b9d-a5fc-375a59758259-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.438012 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.438068 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e2d457-092b-4b9d-a5fc-375a59758259-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.438102 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e2d457-092b-4b9d-a5fc-375a59758259-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.438144 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/01e2d457-092b-4b9d-a5fc-375a59758259-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.438171 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/01e2d457-092b-4b9d-a5fc-375a59758259-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.438235 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01e2d457-092b-4b9d-a5fc-375a59758259-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.438278 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ls5d\" (UniqueName: \"kubernetes.io/projected/01e2d457-092b-4b9d-a5fc-375a59758259-kube-api-access-2ls5d\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.440861 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01e2d457-092b-4b9d-a5fc-375a59758259-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.441205 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01e2d457-092b-4b9d-a5fc-375a59758259-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.441663 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.447877 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01e2d457-092b-4b9d-a5fc-375a59758259-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.450049 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01e2d457-092b-4b9d-a5fc-375a59758259-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.455473 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/01e2d457-092b-4b9d-a5fc-375a59758259-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.455808 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e2d457-092b-4b9d-a5fc-375a59758259-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.456259 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01e2d457-092b-4b9d-a5fc-375a59758259-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.461048 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ls5d\" (UniqueName: \"kubernetes.io/projected/01e2d457-092b-4b9d-a5fc-375a59758259-kube-api-access-2ls5d\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.487953 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"01e2d457-092b-4b9d-a5fc-375a59758259\") " pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.530234 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.536216 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.544125 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-brkmm" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.544931 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.546136 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.560274 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.566786 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.642813 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fca4f477-e812-4926-9935-8bfc1e2ca89a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.642893 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fca4f477-e812-4926-9935-8bfc1e2ca89a-kolla-config\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.642935 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fca4f477-e812-4926-9935-8bfc1e2ca89a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.642959 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfpg9\" (UniqueName: \"kubernetes.io/projected/fca4f477-e812-4926-9935-8bfc1e2ca89a-kube-api-access-zfpg9\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.643038 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fca4f477-e812-4926-9935-8bfc1e2ca89a-config-data\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.745557 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fca4f477-e812-4926-9935-8bfc1e2ca89a-kolla-config\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.745651 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fca4f477-e812-4926-9935-8bfc1e2ca89a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.745692 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfpg9\" (UniqueName: \"kubernetes.io/projected/fca4f477-e812-4926-9935-8bfc1e2ca89a-kube-api-access-zfpg9\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.745797 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fca4f477-e812-4926-9935-8bfc1e2ca89a-config-data\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.745832 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/fca4f477-e812-4926-9935-8bfc1e2ca89a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.748093 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fca4f477-e812-4926-9935-8bfc1e2ca89a-config-data\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.748559 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fca4f477-e812-4926-9935-8bfc1e2ca89a-kolla-config\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.753292 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fca4f477-e812-4926-9935-8bfc1e2ca89a-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.756061 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fca4f477-e812-4926-9935-8bfc1e2ca89a-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.777540 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfpg9\" (UniqueName: \"kubernetes.io/projected/fca4f477-e812-4926-9935-8bfc1e2ca89a-kube-api-access-zfpg9\") pod \"memcached-0\" (UID: \"fca4f477-e812-4926-9935-8bfc1e2ca89a\") " pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.846368 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.887803 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.953659 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10fa6698-b78c-490e-8253-858d7eafce23-catalog-content\") pod \"10fa6698-b78c-490e-8253-858d7eafce23\" (UID: \"10fa6698-b78c-490e-8253-858d7eafce23\") " Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.957017 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10fa6698-b78c-490e-8253-858d7eafce23-utilities\") pod \"10fa6698-b78c-490e-8253-858d7eafce23\" (UID: \"10fa6698-b78c-490e-8253-858d7eafce23\") " Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.957077 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd6c5\" (UniqueName: \"kubernetes.io/projected/10fa6698-b78c-490e-8253-858d7eafce23-kube-api-access-dd6c5\") pod \"10fa6698-b78c-490e-8253-858d7eafce23\" (UID: \"10fa6698-b78c-490e-8253-858d7eafce23\") " Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.959663 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10fa6698-b78c-490e-8253-858d7eafce23-utilities" (OuterVolumeSpecName: "utilities") pod "10fa6698-b78c-490e-8253-858d7eafce23" (UID: "10fa6698-b78c-490e-8253-858d7eafce23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:51:20 crc kubenswrapper[4727]: I1001 12:51:20.979195 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10fa6698-b78c-490e-8253-858d7eafce23-kube-api-access-dd6c5" (OuterVolumeSpecName: "kube-api-access-dd6c5") pod "10fa6698-b78c-490e-8253-858d7eafce23" (UID: "10fa6698-b78c-490e-8253-858d7eafce23"). InnerVolumeSpecName "kube-api-access-dd6c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.045940 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10fa6698-b78c-490e-8253-858d7eafce23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10fa6698-b78c-490e-8253-858d7eafce23" (UID: "10fa6698-b78c-490e-8253-858d7eafce23"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.060126 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10fa6698-b78c-490e-8253-858d7eafce23-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.060165 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10fa6698-b78c-490e-8253-858d7eafce23-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.060175 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd6c5\" (UniqueName: \"kubernetes.io/projected/10fa6698-b78c-490e-8253-858d7eafce23-kube-api-access-dd6c5\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.105117 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.171717 4727 generic.go:334] "Generic (PLEG): container finished" podID="10fa6698-b78c-490e-8253-858d7eafce23" containerID="5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d" exitCode=0 Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.171763 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zf6bl" event={"ID":"10fa6698-b78c-490e-8253-858d7eafce23","Type":"ContainerDied","Data":"5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d"} Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.171797 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zf6bl" event={"ID":"10fa6698-b78c-490e-8253-858d7eafce23","Type":"ContainerDied","Data":"9e8b74539fe12a0fa39f334f01455dc3818d92dbfdbf4474e31b52b4b1aeb4d9"} Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.171818 4727 scope.go:117] "RemoveContainer" containerID="5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.172120 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zf6bl" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.242331 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zf6bl"] Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.245382 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zf6bl"] Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.343794 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.353299 4727 scope.go:117] "RemoveContainer" containerID="cb9a7a8bd67341fd9812ed9335b09c78e7a4ec6ae897c10bb30e9827026111be" Oct 01 12:51:21 crc kubenswrapper[4727]: W1001 12:51:21.378009 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01e2d457_092b_4b9d_a5fc_375a59758259.slice/crio-ad6dd4c5f13763e08ca39d513fe5bd2bcb72024b3437afb13572b01939a0e97e WatchSource:0}: Error finding container ad6dd4c5f13763e08ca39d513fe5bd2bcb72024b3437afb13572b01939a0e97e: Status 404 returned error can't find the container with id ad6dd4c5f13763e08ca39d513fe5bd2bcb72024b3437afb13572b01939a0e97e Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.479739 4727 scope.go:117] "RemoveContainer" containerID="85e7b2906e724c9e2f58c875b3bb8287c9a94765b7149f68714859e375c18b84" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.530641 4727 scope.go:117] "RemoveContainer" containerID="5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d" Oct 01 12:51:21 crc kubenswrapper[4727]: E1001 12:51:21.531086 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d\": container with ID starting with 5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d not found: ID does not exist" containerID="5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.531128 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d"} err="failed to get container status \"5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d\": rpc error: code = NotFound desc = could not find container \"5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d\": container with ID starting with 5653490d6abdb6627ecb9c1e115fe3c167594a2f710d7be1e9d2a19672ec027d not found: ID does not exist" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.531154 4727 scope.go:117] "RemoveContainer" containerID="cb9a7a8bd67341fd9812ed9335b09c78e7a4ec6ae897c10bb30e9827026111be" Oct 01 12:51:21 crc kubenswrapper[4727]: E1001 12:51:21.531406 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb9a7a8bd67341fd9812ed9335b09c78e7a4ec6ae897c10bb30e9827026111be\": container with ID starting with cb9a7a8bd67341fd9812ed9335b09c78e7a4ec6ae897c10bb30e9827026111be not found: ID does not exist" containerID="cb9a7a8bd67341fd9812ed9335b09c78e7a4ec6ae897c10bb30e9827026111be" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.531455 4727 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cb9a7a8bd67341fd9812ed9335b09c78e7a4ec6ae897c10bb30e9827026111be"} err="failed to get container status \"cb9a7a8bd67341fd9812ed9335b09c78e7a4ec6ae897c10bb30e9827026111be\": rpc error: code = NotFound desc = could not find container \"cb9a7a8bd67341fd9812ed9335b09c78e7a4ec6ae897c10bb30e9827026111be\": container with ID starting with cb9a7a8bd67341fd9812ed9335b09c78e7a4ec6ae897c10bb30e9827026111be not found: ID does not exist" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.531530 4727 scope.go:117] "RemoveContainer" containerID="85e7b2906e724c9e2f58c875b3bb8287c9a94765b7149f68714859e375c18b84" Oct 01 12:51:21 crc kubenswrapper[4727]: E1001 12:51:21.531988 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e7b2906e724c9e2f58c875b3bb8287c9a94765b7149f68714859e375c18b84\": container with ID starting with 85e7b2906e724c9e2f58c875b3bb8287c9a94765b7149f68714859e375c18b84 not found: ID does not exist" containerID="85e7b2906e724c9e2f58c875b3bb8287c9a94765b7149f68714859e375c18b84" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.532055 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e7b2906e724c9e2f58c875b3bb8287c9a94765b7149f68714859e375c18b84"} err="failed to get container status \"85e7b2906e724c9e2f58c875b3bb8287c9a94765b7149f68714859e375c18b84\": rpc error: code = NotFound desc = could not find container \"85e7b2906e724c9e2f58c875b3bb8287c9a94765b7149f68714859e375c18b84\": container with ID starting with 85e7b2906e724c9e2f58c875b3bb8287c9a94765b7149f68714859e375c18b84 not found: ID does not exist" Oct 01 12:51:21 crc kubenswrapper[4727]: I1001 12:51:21.615345 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 12:51:21 crc kubenswrapper[4727]: W1001 12:51:21.642513 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfca4f477_e812_4926_9935_8bfc1e2ca89a.slice/crio-adf29c3f5392780ba5fcebd957f9ec11671e4eac4fc4aacebead1a1be70c1375 WatchSource:0}: Error finding container adf29c3f5392780ba5fcebd957f9ec11671e4eac4fc4aacebead1a1be70c1375: Status 404 returned error can't find the container with id adf29c3f5392780ba5fcebd957f9ec11671e4eac4fc4aacebead1a1be70c1375 Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.226773 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d10e37bd-ab54-4798-bfa1-a94f2e13eba0","Type":"ContainerStarted","Data":"4f3284c658674b08fbec62c4efba623c014e07b4fbc4643fbf36cc7be9bb8665"} Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.252351 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fca4f477-e812-4926-9935-8bfc1e2ca89a","Type":"ContainerStarted","Data":"adf29c3f5392780ba5fcebd957f9ec11671e4eac4fc4aacebead1a1be70c1375"} Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.265647 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"01e2d457-092b-4b9d-a5fc-375a59758259","Type":"ContainerStarted","Data":"ad6dd4c5f13763e08ca39d513fe5bd2bcb72024b3437afb13572b01939a0e97e"} Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.316685 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:51:22 crc kubenswrapper[4727]: E1001 12:51:22.317171 4727 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="10fa6698-b78c-490e-8253-858d7eafce23" containerName="extract-content" Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.317196 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="10fa6698-b78c-490e-8253-858d7eafce23" containerName="extract-content" Oct 01 12:51:22 crc kubenswrapper[4727]: E1001 12:51:22.317224 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10fa6698-b78c-490e-8253-858d7eafce23" containerName="extract-utilities" Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.317233 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="10fa6698-b78c-490e-8253-858d7eafce23" containerName="extract-utilities" Oct 01 12:51:22 crc kubenswrapper[4727]: E1001 12:51:22.317266 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10fa6698-b78c-490e-8253-858d7eafce23" containerName="registry-server" Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.317275 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="10fa6698-b78c-490e-8253-858d7eafce23" containerName="registry-server" Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.317446 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="10fa6698-b78c-490e-8253-858d7eafce23" containerName="registry-server" Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.318195 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.322383 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-l9wj4" Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.328120 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.392934 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnns\" (UniqueName: \"kubernetes.io/projected/4226303d-42f1-4267-ac66-5db1def22a4b-kube-api-access-rvnns\") pod \"kube-state-metrics-0\" (UID: \"4226303d-42f1-4267-ac66-5db1def22a4b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.413077 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10fa6698-b78c-490e-8253-858d7eafce23" path="/var/lib/kubelet/pods/10fa6698-b78c-490e-8253-858d7eafce23/volumes" Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.495100 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnns\" (UniqueName: \"kubernetes.io/projected/4226303d-42f1-4267-ac66-5db1def22a4b-kube-api-access-rvnns\") pod \"kube-state-metrics-0\" (UID: \"4226303d-42f1-4267-ac66-5db1def22a4b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.522139 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnns\" (UniqueName: \"kubernetes.io/projected/4226303d-42f1-4267-ac66-5db1def22a4b-kube-api-access-rvnns\") pod \"kube-state-metrics-0\" (UID: \"4226303d-42f1-4267-ac66-5db1def22a4b\") " pod="openstack/kube-state-metrics-0" Oct 01 12:51:22 crc kubenswrapper[4727]: I1001 12:51:22.656425 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.826765 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.828515 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.832415 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.832464 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.832655 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.832840 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.835269 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-sp7j5" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.840040 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.970441 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v56sx"] Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.971391 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v56sx" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.975671 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.976167 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.981052 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-trst6" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.981861 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eab2ee0-0da5-4935-bb03-270c81efbe57-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.981931 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.982321 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0eab2ee0-0da5-4935-bb03-270c81efbe57-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.982421 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0eab2ee0-0da5-4935-bb03-270c81efbe57-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.982452 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eab2ee0-0da5-4935-bb03-270c81efbe57-config\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.982476 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eab2ee0-0da5-4935-bb03-270c81efbe57-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.982494 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqpqt\" (UniqueName: \"kubernetes.io/projected/0eab2ee0-0da5-4935-bb03-270c81efbe57-kube-api-access-pqpqt\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:25 crc kubenswrapper[4727]: I1001 12:51:25.982509 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eab2ee0-0da5-4935-bb03-270c81efbe57-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.014466 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-f9xhb"] Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.018508 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.035388 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v56sx"] Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.082086 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-f9xhb"] Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.083781 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0eab2ee0-0da5-4935-bb03-270c81efbe57-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086041 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eab2ee0-0da5-4935-bb03-270c81efbe57-config\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086077 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eab2ee0-0da5-4935-bb03-270c81efbe57-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086121 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqpqt\" (UniqueName: \"kubernetes.io/projected/0eab2ee0-0da5-4935-bb03-270c81efbe57-kube-api-access-pqpqt\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086145 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eab2ee0-0da5-4935-bb03-270c81efbe57-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086203 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eab2ee0-0da5-4935-bb03-270c81efbe57-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086232 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb0c554e-ed3f-4476-9963-dabc0089698d-scripts\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086281 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb0c554e-ed3f-4476-9963-dabc0089698d-ovn-controller-tls-certs\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086364 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb0c554e-ed3f-4476-9963-dabc0089698d-var-run-ovn\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086395 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086448 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fb0c554e-ed3f-4476-9963-dabc0089698d-var-run\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086479 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0eab2ee0-0da5-4935-bb03-270c81efbe57-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086524 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbflr\" (UniqueName: \"kubernetes.io/projected/fb0c554e-ed3f-4476-9963-dabc0089698d-kube-api-access-bbflr\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086576 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c554e-ed3f-4476-9963-dabc0089698d-combined-ca-bundle\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086596 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fb0c554e-ed3f-4476-9963-dabc0089698d-var-log-ovn\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.086629 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0eab2ee0-0da5-4935-bb03-270c81efbe57-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.088969 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.093426 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0eab2ee0-0da5-4935-bb03-270c81efbe57-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " 
pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.098775 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eab2ee0-0da5-4935-bb03-270c81efbe57-config\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.121418 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.130529 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eab2ee0-0da5-4935-bb03-270c81efbe57-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.131547 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eab2ee0-0da5-4935-bb03-270c81efbe57-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.136814 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eab2ee0-0da5-4935-bb03-270c81efbe57-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.146718 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqpqt\" (UniqueName: \"kubernetes.io/projected/0eab2ee0-0da5-4935-bb03-270c81efbe57-kube-api-access-pqpqt\") pod \"ovsdbserver-nb-0\" (UID: \"0eab2ee0-0da5-4935-bb03-270c81efbe57\") " pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.165748 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187542 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85f4f20e-6398-4386-a1b1-d34d0a4159b3-scripts\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187609 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb0c554e-ed3f-4476-9963-dabc0089698d-scripts\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187637 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb0c554e-ed3f-4476-9963-dabc0089698d-ovn-controller-tls-certs\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187660 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ff6r\" (UniqueName: \"kubernetes.io/projected/85f4f20e-6398-4386-a1b1-d34d0a4159b3-kube-api-access-4ff6r\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187696 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85f4f20e-6398-4386-a1b1-d34d0a4159b3-var-run\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187716 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb0c554e-ed3f-4476-9963-dabc0089698d-var-run-ovn\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187735 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/85f4f20e-6398-4386-a1b1-d34d0a4159b3-var-lib\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187764 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fb0c554e-ed3f-4476-9963-dabc0089698d-var-run\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187799 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbflr\" (UniqueName: \"kubernetes.io/projected/fb0c554e-ed3f-4476-9963-dabc0089698d-kube-api-access-bbflr\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187827 
4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c554e-ed3f-4476-9963-dabc0089698d-combined-ca-bundle\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187844 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fb0c554e-ed3f-4476-9963-dabc0089698d-var-log-ovn\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187880 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85f4f20e-6398-4386-a1b1-d34d0a4159b3-var-log\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.187922 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/85f4f20e-6398-4386-a1b1-d34d0a4159b3-etc-ovs\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.189909 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb0c554e-ed3f-4476-9963-dabc0089698d-var-run-ovn\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.192546 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb0c554e-ed3f-4476-9963-dabc0089698d-scripts\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.192714 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fb0c554e-ed3f-4476-9963-dabc0089698d-var-run\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.193368 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fb0c554e-ed3f-4476-9963-dabc0089698d-var-log-ovn\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.200869 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb0c554e-ed3f-4476-9963-dabc0089698d-ovn-controller-tls-certs\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.201601 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0c554e-ed3f-4476-9963-dabc0089698d-combined-ca-bundle\") pod \"ovn-controller-v56sx\" (UID: 
\"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.224314 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbflr\" (UniqueName: \"kubernetes.io/projected/fb0c554e-ed3f-4476-9963-dabc0089698d-kube-api-access-bbflr\") pod \"ovn-controller-v56sx\" (UID: \"fb0c554e-ed3f-4476-9963-dabc0089698d\") " pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.293666 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85f4f20e-6398-4386-a1b1-d34d0a4159b3-var-log\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.293776 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/85f4f20e-6398-4386-a1b1-d34d0a4159b3-etc-ovs\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.293956 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v56sx" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.293967 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85f4f20e-6398-4386-a1b1-d34d0a4159b3-scripts\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.294303 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/85f4f20e-6398-4386-a1b1-d34d0a4159b3-etc-ovs\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.294835 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ff6r\" (UniqueName: \"kubernetes.io/projected/85f4f20e-6398-4386-a1b1-d34d0a4159b3-kube-api-access-4ff6r\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.294325 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/85f4f20e-6398-4386-a1b1-d34d0a4159b3-var-log\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.294927 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85f4f20e-6398-4386-a1b1-d34d0a4159b3-var-run\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.294958 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/85f4f20e-6398-4386-a1b1-d34d0a4159b3-var-lib\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " 
pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.295046 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85f4f20e-6398-4386-a1b1-d34d0a4159b3-var-run\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.295232 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/85f4f20e-6398-4386-a1b1-d34d0a4159b3-var-lib\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.296378 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85f4f20e-6398-4386-a1b1-d34d0a4159b3-scripts\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.314583 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ff6r\" (UniqueName: \"kubernetes.io/projected/85f4f20e-6398-4386-a1b1-d34d0a4159b3-kube-api-access-4ff6r\") pod \"ovn-controller-ovs-f9xhb\" (UID: \"85f4f20e-6398-4386-a1b1-d34d0a4159b3\") " pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:26 crc kubenswrapper[4727]: I1001 12:51:26.386806 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.622947 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.624883 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.630329 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.630566 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-mn5gt" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.630794 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.630974 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.640494 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.683015 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.683101 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.683134 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.683153 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.683327 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mczht\" (UniqueName: \"kubernetes.io/projected/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-kube-api-access-mczht\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.683426 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.683457 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.683478 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.785889 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.785942 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.786036 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mczht\" (UniqueName: \"kubernetes.io/projected/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-kube-api-access-mczht\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.786120 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.786149 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-config\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.786177 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.786217 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.786277 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.786928 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.786938 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.787087 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-config\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.787238 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.793377 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.796807 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.804935 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mczht\" (UniqueName: \"kubernetes.io/projected/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-kube-api-access-mczht\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.813570 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a48236-0ee4-40a0-b3eb-0f8f8de19b65-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.822960 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"02a48236-0ee4-40a0-b3eb-0f8f8de19b65\") " pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:29 crc kubenswrapper[4727]: I1001 12:51:29.965330 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:36 crc kubenswrapper[4727]: E1001 12:51:36.229130 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 01 12:51:36 crc kubenswrapper[4727]: E1001 12:51:36.229842 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jxjwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(42c8d9a9-fa0f-44c5-9ac1-2361f24c0876): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:51:36 crc kubenswrapper[4727]: E1001 12:51:36.231134 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/rabbitmq-cell1-server-0" podUID="42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" Oct 01 12:51:36 crc kubenswrapper[4727]: E1001 12:51:36.382610 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" Oct 01 12:51:36 crc kubenswrapper[4727]: I1001 12:51:36.690995 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 12:51:41 crc kubenswrapper[4727]: E1001 12:51:41.688701 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Oct 01 12:51:41 crc kubenswrapper[4727]: E1001 12:51:41.690477 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6bffv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(d10e37bd-ab54-4798-bfa1-a94f2e13eba0): ErrImagePull: rpc error: 
code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:51:41 crc kubenswrapper[4727]: E1001 12:51:41.691968 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="d10e37bd-ab54-4798-bfa1-a94f2e13eba0" Oct 01 12:51:42 crc kubenswrapper[4727]: E1001 12:51:42.440668 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="d10e37bd-ab54-4798-bfa1-a94f2e13eba0" Oct 01 12:51:42 crc kubenswrapper[4727]: E1001 12:51:42.725576 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 12:51:42 crc kubenswrapper[4727]: E1001 12:51:42.725770 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7dj99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-tm7qq_openstack(6c74af31-9f4e-4082-85b7-c83442b3444f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:51:42 crc kubenswrapper[4727]: E1001 
12:51:42.727204 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" podUID="6c74af31-9f4e-4082-85b7-c83442b3444f" Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:47.437075 4727 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:47.472459 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0eab2ee0-0da5-4935-bb03-270c81efbe57","Type":"ContainerStarted","Data":"be4fc48a1a593630b3d3a44e790cf45a543d92a9f62c4c5dd8a7483cd75466b7"} Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:47.895502 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:47.895683 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ztn2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-k4sr5_openstack(ef5d956e-c176-4586-83a3-177604f67404): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:47.896915 4727 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" podUID="ef5d956e-c176-4586-83a3-177604f67404" Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:47.917974 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:47.918301 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n56bh8ch68h5f4h647hbbh85h675h587h4h99hc7hb4h5c4h6dh67dh74h599h675h696hbdh597h5ffh694h545h566hdfhdh66h7h648h5f9q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zfpg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(fca4f477-e812-4926-9935-8bfc1e2ca89a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:47.919815 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="fca4f477-e812-4926-9935-8bfc1e2ca89a" Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:48.379736 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:48.380305 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ks4b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-r95pp_openstack(53d1ca15-34cd-43d2-9cab-bb8a54b915ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:48.382521 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" podUID="53d1ca15-34cd-43d2-9cab-bb8a54b915ea" Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.461510 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.496174 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" event={"ID":"6c74af31-9f4e-4082-85b7-c83442b3444f","Type":"ContainerDied","Data":"72002d2216be89ceab8b52dfbfb014608f7fff8e790f711b6221a0b3c248aeff"} Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.496297 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tm7qq" Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:48.504075 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="fca4f477-e812-4926-9935-8bfc1e2ca89a" Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:48.506193 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" podUID="ef5d956e-c176-4586-83a3-177604f67404" Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.608631 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dj99\" (UniqueName: \"kubernetes.io/projected/6c74af31-9f4e-4082-85b7-c83442b3444f-kube-api-access-7dj99\") pod \"6c74af31-9f4e-4082-85b7-c83442b3444f\" (UID: \"6c74af31-9f4e-4082-85b7-c83442b3444f\") " Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.608691 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c74af31-9f4e-4082-85b7-c83442b3444f-config\") pod \"6c74af31-9f4e-4082-85b7-c83442b3444f\" (UID: \"6c74af31-9f4e-4082-85b7-c83442b3444f\") " Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.608779 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c74af31-9f4e-4082-85b7-c83442b3444f-dns-svc\") pod \"6c74af31-9f4e-4082-85b7-c83442b3444f\" (UID: \"6c74af31-9f4e-4082-85b7-c83442b3444f\") " Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.609350 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c74af31-9f4e-4082-85b7-c83442b3444f-config" (OuterVolumeSpecName: "config") pod "6c74af31-9f4e-4082-85b7-c83442b3444f" (UID: "6c74af31-9f4e-4082-85b7-c83442b3444f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.612192 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c74af31-9f4e-4082-85b7-c83442b3444f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c74af31-9f4e-4082-85b7-c83442b3444f" (UID: "6c74af31-9f4e-4082-85b7-c83442b3444f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.613562 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c74af31-9f4e-4082-85b7-c83442b3444f-kube-api-access-7dj99" (OuterVolumeSpecName: "kube-api-access-7dj99") pod "6c74af31-9f4e-4082-85b7-c83442b3444f" (UID: "6c74af31-9f4e-4082-85b7-c83442b3444f"). InnerVolumeSpecName "kube-api-access-7dj99". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.715300 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dj99\" (UniqueName: \"kubernetes.io/projected/6c74af31-9f4e-4082-85b7-c83442b3444f-kube-api-access-7dj99\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.715761 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c74af31-9f4e-4082-85b7-c83442b3444f-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.715775 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c74af31-9f4e-4082-85b7-c83442b3444f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:48.743991 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 12:51:48 crc kubenswrapper[4727]: E1001 12:51:48.744179 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4hrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-rqj8q_openstack(dd17053d-9c00-4932-bace-9382d74d3094): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:51:48 crc 
kubenswrapper[4727]: E1001 12:51:48.745723 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" podUID="dd17053d-9c00-4932-bace-9382d74d3094" Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.786730 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-f9xhb"] Oct 01 12:51:48 crc kubenswrapper[4727]: W1001 12:51:48.793206 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85f4f20e_6398_4386_a1b1_d34d0a4159b3.slice/crio-248513be4bae608044986f786c0ec4a9e40ba33f10d01d418458714f78b0569f WatchSource:0}: Error finding container 248513be4bae608044986f786c0ec4a9e40ba33f10d01d418458714f78b0569f: Status 404 returned error can't find the container with id 248513be4bae608044986f786c0ec4a9e40ba33f10d01d418458714f78b0569f Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.816547 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v56sx"] Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.879183 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.881706 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tm7qq"] Oct 01 12:51:48 crc kubenswrapper[4727]: I1001 12:51:48.887499 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tm7qq"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.035414 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d1ca15-34cd-43d2-9cab-bb8a54b915ea-config\") pod \"53d1ca15-34cd-43d2-9cab-bb8a54b915ea\" (UID: \"53d1ca15-34cd-43d2-9cab-bb8a54b915ea\") " Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.035496 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ks4b\" (UniqueName: \"kubernetes.io/projected/53d1ca15-34cd-43d2-9cab-bb8a54b915ea-kube-api-access-6ks4b\") pod \"53d1ca15-34cd-43d2-9cab-bb8a54b915ea\" (UID: \"53d1ca15-34cd-43d2-9cab-bb8a54b915ea\") " Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.042346 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d1ca15-34cd-43d2-9cab-bb8a54b915ea-config" (OuterVolumeSpecName: "config") pod "53d1ca15-34cd-43d2-9cab-bb8a54b915ea" (UID: "53d1ca15-34cd-43d2-9cab-bb8a54b915ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.043081 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d1ca15-34cd-43d2-9cab-bb8a54b915ea-kube-api-access-6ks4b" (OuterVolumeSpecName: "kube-api-access-6ks4b") pod "53d1ca15-34cd-43d2-9cab-bb8a54b915ea" (UID: "53d1ca15-34cd-43d2-9cab-bb8a54b915ea"). InnerVolumeSpecName "kube-api-access-6ks4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.139785 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d1ca15-34cd-43d2-9cab-bb8a54b915ea-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.139910 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ks4b\" (UniqueName: \"kubernetes.io/projected/53d1ca15-34cd-43d2-9cab-bb8a54b915ea-kube-api-access-6ks4b\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.164570 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.311133 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.499750 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-fp98w"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.501289 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.517256 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.532707 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.533504 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fp98w"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.533565 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-r95pp" event={"ID":"53d1ca15-34cd-43d2-9cab-bb8a54b915ea","Type":"ContainerDied","Data":"8517478e1339ecaa2e25c1c538a5f03b9ed31bb1151d9b43d74fd51b2a2edeb1"} Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.539588 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f9xhb" event={"ID":"85f4f20e-6398-4386-a1b1-d34d0a4159b3","Type":"ContainerStarted","Data":"248513be4bae608044986f786c0ec4a9e40ba33f10d01d418458714f78b0569f"} Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.546390 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4226303d-42f1-4267-ac66-5db1def22a4b","Type":"ContainerStarted","Data":"a130e9643820d5742c5b4ae56bfff4d563dbe0fcec28cbead0e40bdb41cfedf7"} Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.562308 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v56sx" event={"ID":"fb0c554e-ed3f-4476-9963-dabc0089698d","Type":"ContainerStarted","Data":"444e6f860dddde6f976e755a5aca8472bd2d347d26c3db2d8f51101409eaa1b4"} Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.576311 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"01e2d457-092b-4b9d-a5fc-375a59758259","Type":"ContainerStarted","Data":"d176cdd886f8fec4a2c323837ab84096c245b6885275129b578549a76e8ab25c"} Oct 01 12:51:49 crc kubenswrapper[4727]: E1001 12:51:49.581468 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" podUID="dd17053d-9c00-4932-bace-9382d74d3094" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.662421 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d352723-6895-41f9-9ed5-7a90cb94dad6-combined-ca-bundle\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.662473 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3d352723-6895-41f9-9ed5-7a90cb94dad6-ovs-rundir\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.662524 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3d352723-6895-41f9-9ed5-7a90cb94dad6-ovn-rundir\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.662549 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75xmv\" (UniqueName: \"kubernetes.io/projected/3d352723-6895-41f9-9ed5-7a90cb94dad6-kube-api-access-75xmv\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.662580 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d352723-6895-41f9-9ed5-7a90cb94dad6-config\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.662616 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d352723-6895-41f9-9ed5-7a90cb94dad6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.686951 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-k4sr5"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.710254 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r95pp"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.737246 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-r95pp"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.747304 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-dwvj2"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.749412 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.755613 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.757526 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-dwvj2"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.764569 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d352723-6895-41f9-9ed5-7a90cb94dad6-combined-ca-bundle\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.764658 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3d352723-6895-41f9-9ed5-7a90cb94dad6-ovs-rundir\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.764739 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3d352723-6895-41f9-9ed5-7a90cb94dad6-ovn-rundir\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.764775 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75xmv\" (UniqueName: \"kubernetes.io/projected/3d352723-6895-41f9-9ed5-7a90cb94dad6-kube-api-access-75xmv\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.764838 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d352723-6895-41f9-9ed5-7a90cb94dad6-config\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.764948 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d352723-6895-41f9-9ed5-7a90cb94dad6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.767925 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3d352723-6895-41f9-9ed5-7a90cb94dad6-ovs-rundir\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.769416 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3d352723-6895-41f9-9ed5-7a90cb94dad6-ovn-rundir\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.771447 4727 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d352723-6895-41f9-9ed5-7a90cb94dad6-config\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.801237 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d352723-6895-41f9-9ed5-7a90cb94dad6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.803143 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d352723-6895-41f9-9ed5-7a90cb94dad6-combined-ca-bundle\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.805871 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75xmv\" (UniqueName: \"kubernetes.io/projected/3d352723-6895-41f9-9ed5-7a90cb94dad6-kube-api-access-75xmv\") pod \"ovn-controller-metrics-fp98w\" (UID: \"3d352723-6895-41f9-9ed5-7a90cb94dad6\") " pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.866330 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-dwvj2\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.866436 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-dwvj2\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.866472 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-config\") pod \"dnsmasq-dns-7fd796d7df-dwvj2\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.866511 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqzh5\" (UniqueName: \"kubernetes.io/projected/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-kube-api-access-xqzh5\") pod \"dnsmasq-dns-7fd796d7df-dwvj2\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.897530 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rqj8q"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.925832 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-fp98w" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.940087 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rhqzr"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.947837 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.950333 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.957576 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rhqzr"] Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.968546 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-dwvj2\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.969089 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-config\") pod \"dnsmasq-dns-7fd796d7df-dwvj2\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.969141 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqzh5\" (UniqueName: \"kubernetes.io/projected/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-kube-api-access-xqzh5\") pod \"dnsmasq-dns-7fd796d7df-dwvj2\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.969273 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-dwvj2\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.971479 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-config\") pod \"dnsmasq-dns-7fd796d7df-dwvj2\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.973702 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-dwvj2\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.974758 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-dwvj2\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:49 crc kubenswrapper[4727]: I1001 12:51:49.998726 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xqzh5\" (UniqueName: \"kubernetes.io/projected/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-kube-api-access-xqzh5\") pod \"dnsmasq-dns-7fd796d7df-dwvj2\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.072018 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-config\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.072129 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.072202 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64w2n\" (UniqueName: \"kubernetes.io/projected/c3c6c756-7f34-4dad-a114-9e080ec40524-kube-api-access-64w2n\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.072253 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.072281 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.084221 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.157128 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.175108 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztn2c\" (UniqueName: \"kubernetes.io/projected/ef5d956e-c176-4586-83a3-177604f67404-kube-api-access-ztn2c\") pod \"ef5d956e-c176-4586-83a3-177604f67404\" (UID: \"ef5d956e-c176-4586-83a3-177604f67404\") " Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.175248 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef5d956e-c176-4586-83a3-177604f67404-config\") pod \"ef5d956e-c176-4586-83a3-177604f67404\" (UID: \"ef5d956e-c176-4586-83a3-177604f67404\") " Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.175280 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef5d956e-c176-4586-83a3-177604f67404-dns-svc\") pod \"ef5d956e-c176-4586-83a3-177604f67404\" (UID: \"ef5d956e-c176-4586-83a3-177604f67404\") " Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.175527 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-config\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.175626 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.175662 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64w2n\" (UniqueName: \"kubernetes.io/projected/c3c6c756-7f34-4dad-a114-9e080ec40524-kube-api-access-64w2n\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.175714 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.175746 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.176740 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.177734 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.177933 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef5d956e-c176-4586-83a3-177604f67404-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef5d956e-c176-4586-83a3-177604f67404" (UID: "ef5d956e-c176-4586-83a3-177604f67404"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.177941 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-config\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.178511 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef5d956e-c176-4586-83a3-177604f67404-config" (OuterVolumeSpecName: "config") pod "ef5d956e-c176-4586-83a3-177604f67404" (UID: "ef5d956e-c176-4586-83a3-177604f67404"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.183119 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.184852 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5d956e-c176-4586-83a3-177604f67404-kube-api-access-ztn2c" (OuterVolumeSpecName: "kube-api-access-ztn2c") pod "ef5d956e-c176-4586-83a3-177604f67404" (UID: "ef5d956e-c176-4586-83a3-177604f67404"). InnerVolumeSpecName "kube-api-access-ztn2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.207277 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64w2n\" (UniqueName: \"kubernetes.io/projected/c3c6c756-7f34-4dad-a114-9e080ec40524-kube-api-access-64w2n\") pod \"dnsmasq-dns-86db49b7ff-rhqzr\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.286425 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.286174 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztn2c\" (UniqueName: \"kubernetes.io/projected/ef5d956e-c176-4586-83a3-177604f67404-kube-api-access-ztn2c\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.287136 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef5d956e-c176-4586-83a3-177604f67404-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.287152 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef5d956e-c176-4586-83a3-177604f67404-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.389195 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53d1ca15-34cd-43d2-9cab-bb8a54b915ea" path="/var/lib/kubelet/pods/53d1ca15-34cd-43d2-9cab-bb8a54b915ea/volumes" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.390065 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c74af31-9f4e-4082-85b7-c83442b3444f" path="/var/lib/kubelet/pods/6c74af31-9f4e-4082-85b7-c83442b3444f/volumes" Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.576356 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fp98w"] Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.587069 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876","Type":"ContainerStarted","Data":"cf1055988e7ddf033939cb0af9c823e7afdae43301f7bffa7308bfce6e4ca110"} Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.588479 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" event={"ID":"ef5d956e-c176-4586-83a3-177604f67404","Type":"ContainerDied","Data":"0df384f47380ded3e30f9108c507dccdd07289e669bad1100ebeb8ee65ccd520"} Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.588542 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-k4sr5" Oct 01 12:51:50 crc kubenswrapper[4727]: W1001 12:51:50.591216 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d352723_6895_41f9_9ed5_7a90cb94dad6.slice/crio-9294105b7ae4a376a9f907286fb0cc0feb04669b209094a89ffebfdb9284aa68 WatchSource:0}: Error finding container 9294105b7ae4a376a9f907286fb0cc0feb04669b209094a89ffebfdb9284aa68: Status 404 returned error can't find the container with id 9294105b7ae4a376a9f907286fb0cc0feb04669b209094a89ffebfdb9284aa68 Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.591382 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74ad068e-3c83-4fd2-af0a-7e45cd945411","Type":"ContainerStarted","Data":"f86c783bf5387b32ebbd44f58fac153a5bb6b813b9d45eab24f2dd3b42107af2"} Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.592658 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"02a48236-0ee4-40a0-b3eb-0f8f8de19b65","Type":"ContainerStarted","Data":"aa0c95cab3d4b10bc5da251b7f31f35f50fb6081d923977a8aa42bd7835e9cce"} Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.740276 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-dwvj2"] Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.748230 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-k4sr5"] Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.757741 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-k4sr5"] Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.823445 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rhqzr"] Oct 01 12:51:50 crc kubenswrapper[4727]: W1001 12:51:50.839208 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3c6c756_7f34_4dad_a114_9e080ec40524.slice/crio-13b4d49df791acd152222a7b10d91c3560791c5aa287af54711e9c70d1624f82 WatchSource:0}: Error finding container 13b4d49df791acd152222a7b10d91c3560791c5aa287af54711e9c70d1624f82: Status 404 returned error can't find the container with id 13b4d49df791acd152222a7b10d91c3560791c5aa287af54711e9c70d1624f82 Oct 01 12:51:50 crc kubenswrapper[4727]: I1001 12:51:50.910715 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.000694 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4hrx\" (UniqueName: \"kubernetes.io/projected/dd17053d-9c00-4932-bace-9382d74d3094-kube-api-access-w4hrx\") pod \"dd17053d-9c00-4932-bace-9382d74d3094\" (UID: \"dd17053d-9c00-4932-bace-9382d74d3094\") " Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.001237 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd17053d-9c00-4932-bace-9382d74d3094-dns-svc\") pod \"dd17053d-9c00-4932-bace-9382d74d3094\" (UID: \"dd17053d-9c00-4932-bace-9382d74d3094\") " Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.001303 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd17053d-9c00-4932-bace-9382d74d3094-config\") pod \"dd17053d-9c00-4932-bace-9382d74d3094\" (UID: \"dd17053d-9c00-4932-bace-9382d74d3094\") " Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.001993 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd17053d-9c00-4932-bace-9382d74d3094-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd17053d-9c00-4932-bace-9382d74d3094" (UID: "dd17053d-9c00-4932-bace-9382d74d3094"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.002378 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd17053d-9c00-4932-bace-9382d74d3094-config" (OuterVolumeSpecName: "config") pod "dd17053d-9c00-4932-bace-9382d74d3094" (UID: "dd17053d-9c00-4932-bace-9382d74d3094"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.006166 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd17053d-9c00-4932-bace-9382d74d3094-kube-api-access-w4hrx" (OuterVolumeSpecName: "kube-api-access-w4hrx") pod "dd17053d-9c00-4932-bace-9382d74d3094" (UID: "dd17053d-9c00-4932-bace-9382d74d3094"). InnerVolumeSpecName "kube-api-access-w4hrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.104627 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd17053d-9c00-4932-bace-9382d74d3094-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.104704 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd17053d-9c00-4932-bace-9382d74d3094-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.104716 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4hrx\" (UniqueName: \"kubernetes.io/projected/dd17053d-9c00-4932-bace-9382d74d3094-kube-api-access-w4hrx\") on node \"crc\" DevicePath \"\"" Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.603421 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" event={"ID":"c3c6c756-7f34-4dad-a114-9e080ec40524","Type":"ContainerStarted","Data":"13b4d49df791acd152222a7b10d91c3560791c5aa287af54711e9c70d1624f82"} Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.605208 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fp98w" event={"ID":"3d352723-6895-41f9-9ed5-7a90cb94dad6","Type":"ContainerStarted","Data":"9294105b7ae4a376a9f907286fb0cc0feb04669b209094a89ffebfdb9284aa68"} Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.608692 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" event={"ID":"21c13fea-8344-4f2d-bbe1-b6a9438cb4db","Type":"ContainerStarted","Data":"07ae3d2e0b4703a3b068d8c7e04f9292ce9e4230ac1d327ce34f6552d68d0e4c"} Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.612990 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.616113 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rqj8q" event={"ID":"dd17053d-9c00-4932-bace-9382d74d3094","Type":"ContainerDied","Data":"7b7cc177f083547256dff22d000eec32dd501dc7bea684efbde1747792f33056"} Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.676945 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rqj8q"] Oct 01 12:51:51 crc kubenswrapper[4727]: I1001 12:51:51.686033 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rqj8q"] Oct 01 12:51:52 crc kubenswrapper[4727]: I1001 12:51:52.385192 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd17053d-9c00-4932-bace-9382d74d3094" path="/var/lib/kubelet/pods/dd17053d-9c00-4932-bace-9382d74d3094/volumes" Oct 01 12:51:52 crc kubenswrapper[4727]: I1001 12:51:52.385856 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef5d956e-c176-4586-83a3-177604f67404" path="/var/lib/kubelet/pods/ef5d956e-c176-4586-83a3-177604f67404/volumes" Oct 01 12:51:54 crc kubenswrapper[4727]: I1001 12:51:54.634358 4727 generic.go:334] "Generic (PLEG): container finished" podID="01e2d457-092b-4b9d-a5fc-375a59758259" containerID="d176cdd886f8fec4a2c323837ab84096c245b6885275129b578549a76e8ab25c" exitCode=0 Oct 01 12:51:54 crc kubenswrapper[4727]: I1001 12:51:54.634463 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"01e2d457-092b-4b9d-a5fc-375a59758259","Type":"ContainerDied","Data":"d176cdd886f8fec4a2c323837ab84096c245b6885275129b578549a76e8ab25c"} Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.688807 4727 generic.go:334] "Generic (PLEG): container finished" podID="85f4f20e-6398-4386-a1b1-d34d0a4159b3" containerID="4c5135d1b1f4291f06f07bccc32a5a5d83135936d8fcf840914b6220e2e924cd" exitCode=0 Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.689293 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f9xhb" event={"ID":"85f4f20e-6398-4386-a1b1-d34d0a4159b3","Type":"ContainerDied","Data":"4c5135d1b1f4291f06f07bccc32a5a5d83135936d8fcf840914b6220e2e924cd"} Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.694306 4727 generic.go:334] "Generic (PLEG): container finished" podID="21c13fea-8344-4f2d-bbe1-b6a9438cb4db" containerID="0a635e7cefef58e8f5c3dd4c60f12bf73a1c41cafcb6c947201ae2e26d9f8650" exitCode=0 Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.694383 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" event={"ID":"21c13fea-8344-4f2d-bbe1-b6a9438cb4db","Type":"ContainerDied","Data":"0a635e7cefef58e8f5c3dd4c60f12bf73a1c41cafcb6c947201ae2e26d9f8650"} Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.702060 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"02a48236-0ee4-40a0-b3eb-0f8f8de19b65","Type":"ContainerStarted","Data":"9355f566d1663003987b73b33e6559e5e71dea3468dba99f7ddf9d250cca0ea6"} Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.702115 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"02a48236-0ee4-40a0-b3eb-0f8f8de19b65","Type":"ContainerStarted","Data":"a4138a0d8810384926493cd53bff198452c5aa4e290dd759d9c60e984665210f"} Oct 01 12:51:57 crc kubenswrapper[4727]: 
I1001 12:51:57.703926 4727 generic.go:334] "Generic (PLEG): container finished" podID="c3c6c756-7f34-4dad-a114-9e080ec40524" containerID="fb09c277734d73aee838d0fe4c5093bcaf07692683a6fe004a6be9082ba7b938" exitCode=0 Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.703981 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" event={"ID":"c3c6c756-7f34-4dad-a114-9e080ec40524","Type":"ContainerDied","Data":"fb09c277734d73aee838d0fe4c5093bcaf07692683a6fe004a6be9082ba7b938"} Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.706928 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fp98w" event={"ID":"3d352723-6895-41f9-9ed5-7a90cb94dad6","Type":"ContainerStarted","Data":"d4d1195cb25d727eca05072a11f428fb3ec938b38eea82800e38031c3c76f831"} Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.709751 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"01e2d457-092b-4b9d-a5fc-375a59758259","Type":"ContainerStarted","Data":"509a8cac158fff83eeb43df638753ca77919023838f1ec0ec5471327f4677b3d"} Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.717391 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0eab2ee0-0da5-4935-bb03-270c81efbe57","Type":"ContainerStarted","Data":"2b99308fe36029642cc35723f42d6c64b04bf0f65a8d2ce0402b9321ef4d8a13"} Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.717454 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0eab2ee0-0da5-4935-bb03-270c81efbe57","Type":"ContainerStarted","Data":"98956b8d0b2622bde72fff3b2b3cc49e042b6856a9b803a084b18177da2b0575"} Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.720821 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d10e37bd-ab54-4798-bfa1-a94f2e13eba0","Type":"ContainerStarted","Data":"e9917b524df8b31dde2ac104c514071d39e8719b9953358ecae049381d21d899"} Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.723568 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4226303d-42f1-4267-ac66-5db1def22a4b","Type":"ContainerStarted","Data":"f1fb8e78533bbf52baf5a0c050a858cd1f1c1685dfb78834d97e9dca7b4d4504"} Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.723755 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.726835 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v56sx" event={"ID":"fb0c554e-ed3f-4476-9963-dabc0089698d","Type":"ContainerStarted","Data":"749477be554f34ec042de3f62476b4ebbc190e321175cf7f40a73a616e6e930d"} Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.727071 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-v56sx" Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.770308 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=23.76984655 podStartE2EDuration="29.77029094s" podCreationTimestamp="2025-10-01 12:51:28 +0000 UTC" firstStartedPulling="2025-10-01 12:51:49.610157252 +0000 UTC m=+887.931512079" lastFinishedPulling="2025-10-01 12:51:55.610601632 +0000 UTC m=+893.931956469" observedRunningTime="2025-10-01 12:51:57.766851251 +0000 UTC m=+896.088206108" 
watchObservedRunningTime="2025-10-01 12:51:57.77029094 +0000 UTC m=+896.091645777" Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.828452 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=11.727112229 podStartE2EDuration="38.828431162s" podCreationTimestamp="2025-10-01 12:51:19 +0000 UTC" firstStartedPulling="2025-10-01 12:51:21.383165797 +0000 UTC m=+859.704520634" lastFinishedPulling="2025-10-01 12:51:48.48448473 +0000 UTC m=+886.805839567" observedRunningTime="2025-10-01 12:51:57.822293089 +0000 UTC m=+896.143647936" watchObservedRunningTime="2025-10-01 12:51:57.828431162 +0000 UTC m=+896.149786009" Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.853786 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-fp98w" podStartSLOduration=3.119320173 podStartE2EDuration="8.853767081s" podCreationTimestamp="2025-10-01 12:51:49 +0000 UTC" firstStartedPulling="2025-10-01 12:51:50.596273897 +0000 UTC m=+888.917628734" lastFinishedPulling="2025-10-01 12:51:56.330720805 +0000 UTC m=+894.652075642" observedRunningTime="2025-10-01 12:51:57.849174715 +0000 UTC m=+896.170529562" watchObservedRunningTime="2025-10-01 12:51:57.853767081 +0000 UTC m=+896.175121928" Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.958767 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=27.247510741 podStartE2EDuration="33.958748479s" podCreationTimestamp="2025-10-01 12:51:24 +0000 UTC" firstStartedPulling="2025-10-01 12:51:47.436843617 +0000 UTC m=+885.758198454" lastFinishedPulling="2025-10-01 12:51:54.148081355 +0000 UTC m=+892.469436192" observedRunningTime="2025-10-01 12:51:57.95751502 +0000 UTC m=+896.278869867" watchObservedRunningTime="2025-10-01 12:51:57.958748479 +0000 UTC m=+896.280103316" Oct 01 12:51:57 crc kubenswrapper[4727]: I1001 12:51:57.988441 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=28.908254769 podStartE2EDuration="35.988418584s" podCreationTimestamp="2025-10-01 12:51:22 +0000 UTC" firstStartedPulling="2025-10-01 12:51:49.281940289 +0000 UTC m=+887.603295126" lastFinishedPulling="2025-10-01 12:51:56.362104104 +0000 UTC m=+894.683458941" observedRunningTime="2025-10-01 12:51:57.978809291 +0000 UTC m=+896.300164128" watchObservedRunningTime="2025-10-01 12:51:57.988418584 +0000 UTC m=+896.309773421" Oct 01 12:51:58 crc kubenswrapper[4727]: I1001 12:51:58.019644 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-v56sx" podStartSLOduration=27.561512138 podStartE2EDuration="33.019621387s" podCreationTimestamp="2025-10-01 12:51:25 +0000 UTC" firstStartedPulling="2025-10-01 12:51:48.832650802 +0000 UTC m=+887.154005629" lastFinishedPulling="2025-10-01 12:51:54.290760041 +0000 UTC m=+892.612114878" observedRunningTime="2025-10-01 12:51:58.005409189 +0000 UTC m=+896.326764036" watchObservedRunningTime="2025-10-01 12:51:58.019621387 +0000 UTC m=+896.340976224" Oct 01 12:51:58 crc kubenswrapper[4727]: I1001 12:51:58.741724 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" event={"ID":"21c13fea-8344-4f2d-bbe1-b6a9438cb4db","Type":"ContainerStarted","Data":"c1284ee0cd860bd644d0c03f74242eec5e05a409d87bbd0ec28b2ea183c98618"} Oct 01 12:51:58 crc kubenswrapper[4727]: I1001 12:51:58.742143 4727 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:51:58 crc kubenswrapper[4727]: I1001 12:51:58.748222 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" event={"ID":"c3c6c756-7f34-4dad-a114-9e080ec40524","Type":"ContainerStarted","Data":"9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277"} Oct 01 12:51:58 crc kubenswrapper[4727]: I1001 12:51:58.748342 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:51:58 crc kubenswrapper[4727]: I1001 12:51:58.755160 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f9xhb" event={"ID":"85f4f20e-6398-4386-a1b1-d34d0a4159b3","Type":"ContainerStarted","Data":"5e4877e27bd12bb2b053a561fbccf5204737b8470a223c1dc5ed35ef336aa79c"} Oct 01 12:51:58 crc kubenswrapper[4727]: I1001 12:51:58.755236 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-f9xhb" event={"ID":"85f4f20e-6398-4386-a1b1-d34d0a4159b3","Type":"ContainerStarted","Data":"6d5750a41e084a7a2f58486c9c057510217e7caeedcf4f475a6f628a0e877d63"} Oct 01 12:51:58 crc kubenswrapper[4727]: I1001 12:51:58.765304 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" podStartSLOduration=6.301354457 podStartE2EDuration="9.765289445s" podCreationTimestamp="2025-10-01 12:51:49 +0000 UTC" firstStartedPulling="2025-10-01 12:51:50.770226619 +0000 UTC m=+889.091581456" lastFinishedPulling="2025-10-01 12:51:54.234161617 +0000 UTC m=+892.555516444" observedRunningTime="2025-10-01 12:51:58.763731775 +0000 UTC m=+897.085086612" watchObservedRunningTime="2025-10-01 12:51:58.765289445 +0000 UTC m=+897.086644282" Oct 01 12:51:58 crc kubenswrapper[4727]: I1001 12:51:58.789494 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" podStartSLOduration=6.592335746 podStartE2EDuration="9.789474537s" podCreationTimestamp="2025-10-01 12:51:49 +0000 UTC" firstStartedPulling="2025-10-01 12:51:50.841622428 +0000 UTC m=+889.162977265" lastFinishedPulling="2025-10-01 12:51:54.038761219 +0000 UTC m=+892.360116056" observedRunningTime="2025-10-01 12:51:58.783563641 +0000 UTC m=+897.104918508" watchObservedRunningTime="2025-10-01 12:51:58.789474537 +0000 UTC m=+897.110829374" Oct 01 12:51:58 crc kubenswrapper[4727]: I1001 12:51:58.808131 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-f9xhb" podStartSLOduration=28.463012577 podStartE2EDuration="33.808115035s" podCreationTimestamp="2025-10-01 12:51:25 +0000 UTC" firstStartedPulling="2025-10-01 12:51:48.802372097 +0000 UTC m=+887.123726934" lastFinishedPulling="2025-10-01 12:51:54.147474555 +0000 UTC m=+892.468829392" observedRunningTime="2025-10-01 12:51:58.804400918 +0000 UTC m=+897.125755775" watchObservedRunningTime="2025-10-01 12:51:58.808115035 +0000 UTC m=+897.129469862" Oct 01 12:51:59 crc kubenswrapper[4727]: I1001 12:51:59.176768 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:59 crc kubenswrapper[4727]: I1001 12:51:59.219118 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:59 crc kubenswrapper[4727]: I1001 12:51:59.762261 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ovsdbserver-nb-0" Oct 01 12:51:59 crc kubenswrapper[4727]: I1001 12:51:59.762314 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:59 crc kubenswrapper[4727]: I1001 12:51:59.762328 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:51:59 crc kubenswrapper[4727]: I1001 12:51:59.965838 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 01 12:51:59 crc kubenswrapper[4727]: I1001 12:51:59.965892 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:00 crc kubenswrapper[4727]: I1001 12:52:00.011709 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:00 crc kubenswrapper[4727]: I1001 12:52:00.567676 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:00 crc kubenswrapper[4727]: I1001 12:52:00.567721 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:00 crc kubenswrapper[4727]: I1001 12:52:00.771702 4727 generic.go:334] "Generic (PLEG): container finished" podID="d10e37bd-ab54-4798-bfa1-a94f2e13eba0" containerID="e9917b524df8b31dde2ac104c514071d39e8719b9953358ecae049381d21d899" exitCode=0 Oct 01 12:52:00 crc kubenswrapper[4727]: I1001 12:52:00.771787 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d10e37bd-ab54-4798-bfa1-a94f2e13eba0","Type":"ContainerDied","Data":"e9917b524df8b31dde2ac104c514071d39e8719b9953358ecae049381d21d899"} Oct 01 12:52:01 crc kubenswrapper[4727]: I1001 12:52:01.230681 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 01 12:52:01 crc kubenswrapper[4727]: I1001 12:52:01.781573 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d10e37bd-ab54-4798-bfa1-a94f2e13eba0","Type":"ContainerStarted","Data":"3c3b3e2e7233841f993e73860e51a843dd01b889fc5517fe549be6f217603cbe"} Oct 01 12:52:01 crc kubenswrapper[4727]: I1001 12:52:01.810214 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371993.04459 podStartE2EDuration="43.810186028s" podCreationTimestamp="2025-10-01 12:51:18 +0000 UTC" firstStartedPulling="2025-10-01 12:51:21.15922641 +0000 UTC m=+859.480581247" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:52:01.802720863 +0000 UTC m=+900.124075730" watchObservedRunningTime="2025-10-01 12:52:01.810186028 +0000 UTC m=+900.131540905" Oct 01 12:52:02 crc kubenswrapper[4727]: I1001 12:52:02.623973 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:02 crc kubenswrapper[4727]: I1001 12:52:02.666561 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 12:52:02 crc kubenswrapper[4727]: I1001 12:52:02.668577 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 01 12:52:03 crc kubenswrapper[4727]: I1001 12:52:03.291750 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:52:03 crc kubenswrapper[4727]: I1001 12:52:03.291825 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:52:03 crc kubenswrapper[4727]: I1001 12:52:03.815133 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fca4f477-e812-4926-9935-8bfc1e2ca89a","Type":"ContainerStarted","Data":"bcc062d5558e5804deb5d9ba5560f8d86be2b529fa69862de9a4920705710b50"} Oct 01 12:52:03 crc kubenswrapper[4727]: I1001 12:52:03.815477 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 01 12:52:03 crc kubenswrapper[4727]: I1001 12:52:03.841861 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.438500693 podStartE2EDuration="43.841843451s" podCreationTimestamp="2025-10-01 12:51:20 +0000 UTC" firstStartedPulling="2025-10-01 12:51:21.668325393 +0000 UTC m=+859.989680230" lastFinishedPulling="2025-10-01 12:52:03.071668151 +0000 UTC m=+901.393022988" observedRunningTime="2025-10-01 12:52:03.841654825 +0000 UTC m=+902.163009682" watchObservedRunningTime="2025-10-01 12:52:03.841843451 +0000 UTC m=+902.163198288" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.030410 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.160044 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.289198 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.311447 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.312976 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.316666 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-pbnhn" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.316735 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.316776 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.316741 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.330394 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.370134 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dcad3e1-a101-49be-a117-fe48c45b2ab5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.370196 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcad3e1-a101-49be-a117-fe48c45b2ab5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.370267 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dcad3e1-a101-49be-a117-fe48c45b2ab5-scripts\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.370355 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qng59\" (UniqueName: \"kubernetes.io/projected/9dcad3e1-a101-49be-a117-fe48c45b2ab5-kube-api-access-qng59\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.370404 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9dcad3e1-a101-49be-a117-fe48c45b2ab5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.370433 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dcad3e1-a101-49be-a117-fe48c45b2ab5-config\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.370452 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcad3e1-a101-49be-a117-fe48c45b2ab5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: 
I1001 12:52:05.407218 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-dwvj2"] Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.472396 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dcad3e1-a101-49be-a117-fe48c45b2ab5-scripts\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.472734 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qng59\" (UniqueName: \"kubernetes.io/projected/9dcad3e1-a101-49be-a117-fe48c45b2ab5-kube-api-access-qng59\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.472865 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9dcad3e1-a101-49be-a117-fe48c45b2ab5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.472970 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dcad3e1-a101-49be-a117-fe48c45b2ab5-config\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.473070 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcad3e1-a101-49be-a117-fe48c45b2ab5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.473241 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dcad3e1-a101-49be-a117-fe48c45b2ab5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.473319 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dcad3e1-a101-49be-a117-fe48c45b2ab5-scripts\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.473349 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9dcad3e1-a101-49be-a117-fe48c45b2ab5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.473336 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcad3e1-a101-49be-a117-fe48c45b2ab5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.473869 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dcad3e1-a101-49be-a117-fe48c45b2ab5-config\") pod 
\"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.478950 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcad3e1-a101-49be-a117-fe48c45b2ab5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.479178 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dcad3e1-a101-49be-a117-fe48c45b2ab5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.479461 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dcad3e1-a101-49be-a117-fe48c45b2ab5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.491338 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qng59\" (UniqueName: \"kubernetes.io/projected/9dcad3e1-a101-49be-a117-fe48c45b2ab5-kube-api-access-qng59\") pod \"ovn-northd-0\" (UID: \"9dcad3e1-a101-49be-a117-fe48c45b2ab5\") " pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.638030 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 12:52:05 crc kubenswrapper[4727]: I1001 12:52:05.840764 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" podUID="21c13fea-8344-4f2d-bbe1-b6a9438cb4db" containerName="dnsmasq-dns" containerID="cri-o://c1284ee0cd860bd644d0c03f74242eec5e05a409d87bbd0ec28b2ea183c98618" gracePeriod=10 Oct 01 12:52:06 crc kubenswrapper[4727]: I1001 12:52:06.115881 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 12:52:06 crc kubenswrapper[4727]: W1001 12:52:06.120245 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dcad3e1_a101_49be_a117_fe48c45b2ab5.slice/crio-f2b13c8bd1efbe9ed9ab4056653530c4579a0e1363c1b12dc9d8e1dcad0b626c WatchSource:0}: Error finding container f2b13c8bd1efbe9ed9ab4056653530c4579a0e1363c1b12dc9d8e1dcad0b626c: Status 404 returned error can't find the container with id f2b13c8bd1efbe9ed9ab4056653530c4579a0e1363c1b12dc9d8e1dcad0b626c Oct 01 12:52:06 crc kubenswrapper[4727]: I1001 12:52:06.850366 4727 generic.go:334] "Generic (PLEG): container finished" podID="21c13fea-8344-4f2d-bbe1-b6a9438cb4db" containerID="c1284ee0cd860bd644d0c03f74242eec5e05a409d87bbd0ec28b2ea183c98618" exitCode=0 Oct 01 12:52:06 crc kubenswrapper[4727]: I1001 12:52:06.850441 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" event={"ID":"21c13fea-8344-4f2d-bbe1-b6a9438cb4db","Type":"ContainerDied","Data":"c1284ee0cd860bd644d0c03f74242eec5e05a409d87bbd0ec28b2ea183c98618"} Oct 01 12:52:06 crc kubenswrapper[4727]: I1001 12:52:06.851875 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"9dcad3e1-a101-49be-a117-fe48c45b2ab5","Type":"ContainerStarted","Data":"f2b13c8bd1efbe9ed9ab4056653530c4579a0e1363c1b12dc9d8e1dcad0b626c"} Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.748323 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.816059 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-dns-svc\") pod \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.816136 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-config\") pod \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.816183 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqzh5\" (UniqueName: \"kubernetes.io/projected/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-kube-api-access-xqzh5\") pod \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.816225 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-ovsdbserver-nb\") pod \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\" (UID: \"21c13fea-8344-4f2d-bbe1-b6a9438cb4db\") " Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.854403 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-kube-api-access-xqzh5" (OuterVolumeSpecName: "kube-api-access-xqzh5") pod "21c13fea-8344-4f2d-bbe1-b6a9438cb4db" (UID: "21c13fea-8344-4f2d-bbe1-b6a9438cb4db"). InnerVolumeSpecName "kube-api-access-xqzh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.889219 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21c13fea-8344-4f2d-bbe1-b6a9438cb4db" (UID: "21c13fea-8344-4f2d-bbe1-b6a9438cb4db"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.918101 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.918152 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqzh5\" (UniqueName: \"kubernetes.io/projected/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-kube-api-access-xqzh5\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.926252 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" event={"ID":"21c13fea-8344-4f2d-bbe1-b6a9438cb4db","Type":"ContainerDied","Data":"07ae3d2e0b4703a3b068d8c7e04f9292ce9e4230ac1d327ce34f6552d68d0e4c"} Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.926308 4727 scope.go:117] "RemoveContainer" containerID="c1284ee0cd860bd644d0c03f74242eec5e05a409d87bbd0ec28b2ea183c98618" Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.926458 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-dwvj2" Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.950547 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-config" (OuterVolumeSpecName: "config") pod "21c13fea-8344-4f2d-bbe1-b6a9438cb4db" (UID: "21c13fea-8344-4f2d-bbe1-b6a9438cb4db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.954856 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21c13fea-8344-4f2d-bbe1-b6a9438cb4db" (UID: "21c13fea-8344-4f2d-bbe1-b6a9438cb4db"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:07 crc kubenswrapper[4727]: I1001 12:52:07.993919 4727 scope.go:117] "RemoveContainer" containerID="0a635e7cefef58e8f5c3dd4c60f12bf73a1c41cafcb6c947201ae2e26d9f8650" Oct 01 12:52:08 crc kubenswrapper[4727]: I1001 12:52:08.020050 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:08 crc kubenswrapper[4727]: I1001 12:52:08.020096 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21c13fea-8344-4f2d-bbe1-b6a9438cb4db-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:08 crc kubenswrapper[4727]: I1001 12:52:08.257891 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-dwvj2"] Oct 01 12:52:08 crc kubenswrapper[4727]: I1001 12:52:08.263745 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-dwvj2"] Oct 01 12:52:08 crc kubenswrapper[4727]: I1001 12:52:08.388801 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c13fea-8344-4f2d-bbe1-b6a9438cb4db" path="/var/lib/kubelet/pods/21c13fea-8344-4f2d-bbe1-b6a9438cb4db/volumes" Oct 01 12:52:08 crc kubenswrapper[4727]: I1001 12:52:08.936233 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9dcad3e1-a101-49be-a117-fe48c45b2ab5","Type":"ContainerStarted","Data":"7ca320ceb72bf3716c91377edab858d490caf638ec3fc80c1ffddb18cf5b4f56"} Oct 01 12:52:08 crc kubenswrapper[4727]: I1001 12:52:08.936275 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9dcad3e1-a101-49be-a117-fe48c45b2ab5","Type":"ContainerStarted","Data":"73feba35141dc33de29285539fefdc809dbe97ca60de7deee56a6b70276668c5"} Oct 01 12:52:08 crc kubenswrapper[4727]: I1001 12:52:08.936395 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 01 12:52:08 crc kubenswrapper[4727]: I1001 12:52:08.958237 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.7888093870000001 podStartE2EDuration="3.958216172s" podCreationTimestamp="2025-10-01 12:52:05 +0000 UTC" firstStartedPulling="2025-10-01 12:52:06.12248636 +0000 UTC m=+904.443841197" lastFinishedPulling="2025-10-01 12:52:08.291893145 +0000 UTC m=+906.613247982" observedRunningTime="2025-10-01 12:52:08.954765363 +0000 UTC m=+907.276120220" watchObservedRunningTime="2025-10-01 12:52:08.958216172 +0000 UTC m=+907.279571019" Oct 01 12:52:10 crc kubenswrapper[4727]: I1001 12:52:10.116203 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 01 12:52:10 crc kubenswrapper[4727]: I1001 12:52:10.116568 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 01 12:52:10 crc kubenswrapper[4727]: I1001 12:52:10.161914 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 01 12:52:10 crc kubenswrapper[4727]: I1001 12:52:10.889200 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 01 12:52:11 crc kubenswrapper[4727]: I1001 12:52:11.005459 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" 
Oct 01 12:52:12 crc kubenswrapper[4727]: I1001 12:52:12.762583 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-j2xww"] Oct 01 12:52:12 crc kubenswrapper[4727]: E1001 12:52:12.763890 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c13fea-8344-4f2d-bbe1-b6a9438cb4db" containerName="dnsmasq-dns" Oct 01 12:52:12 crc kubenswrapper[4727]: I1001 12:52:12.763915 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c13fea-8344-4f2d-bbe1-b6a9438cb4db" containerName="dnsmasq-dns" Oct 01 12:52:12 crc kubenswrapper[4727]: E1001 12:52:12.763942 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c13fea-8344-4f2d-bbe1-b6a9438cb4db" containerName="init" Oct 01 12:52:12 crc kubenswrapper[4727]: I1001 12:52:12.763948 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c13fea-8344-4f2d-bbe1-b6a9438cb4db" containerName="init" Oct 01 12:52:12 crc kubenswrapper[4727]: I1001 12:52:12.764128 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c13fea-8344-4f2d-bbe1-b6a9438cb4db" containerName="dnsmasq-dns" Oct 01 12:52:12 crc kubenswrapper[4727]: I1001 12:52:12.764948 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:12 crc kubenswrapper[4727]: I1001 12:52:12.784032 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j2xww"] Oct 01 12:52:12 crc kubenswrapper[4727]: I1001 12:52:12.907862 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cskvh\" (UniqueName: \"kubernetes.io/projected/fa41b431-40d7-43db-a5a0-d05552cda2d1-kube-api-access-cskvh\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:12 crc kubenswrapper[4727]: I1001 12:52:12.908027 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-dns-svc\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:12 crc kubenswrapper[4727]: I1001 12:52:12.908099 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-config\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:12 crc kubenswrapper[4727]: I1001 12:52:12.908130 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:12 crc kubenswrapper[4727]: I1001 12:52:12.908232 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 
12:52:13.009804 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cskvh\" (UniqueName: \"kubernetes.io/projected/fa41b431-40d7-43db-a5a0-d05552cda2d1-kube-api-access-cskvh\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.009894 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-dns-svc\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.009930 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-config\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.009958 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.010034 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.011129 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-dns-svc\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.011212 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-config\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.011306 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.011448 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.035633 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cskvh\" (UniqueName: 
\"kubernetes.io/projected/fa41b431-40d7-43db-a5a0-d05552cda2d1-kube-api-access-cskvh\") pod \"dnsmasq-dns-698758b865-j2xww\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.086816 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.533558 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j2xww"] Oct 01 12:52:13 crc kubenswrapper[4727]: W1001 12:52:13.542789 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa41b431_40d7_43db_a5a0_d05552cda2d1.slice/crio-dd42968034e535b441249f4b4484bfed4a9fc0a5ac6772e5220d695d21b45ce2 WatchSource:0}: Error finding container dd42968034e535b441249f4b4484bfed4a9fc0a5ac6772e5220d695d21b45ce2: Status 404 returned error can't find the container with id dd42968034e535b441249f4b4484bfed4a9fc0a5ac6772e5220d695d21b45ce2 Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.906078 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.912405 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.919484 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.919700 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.919835 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.919887 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-6cf88" Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.923692 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.990571 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j2xww" event={"ID":"fa41b431-40d7-43db-a5a0-d05552cda2d1","Type":"ContainerStarted","Data":"0a063a6ae0e10474a4db8dabe6601c2ec7bac7eb5e98703dc6e80593d3d9ffe5"} Oct 01 12:52:13 crc kubenswrapper[4727]: I1001 12:52:13.990627 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j2xww" event={"ID":"fa41b431-40d7-43db-a5a0-d05552cda2d1","Type":"ContainerStarted","Data":"dd42968034e535b441249f4b4484bfed4a9fc0a5ac6772e5220d695d21b45ce2"} Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.027269 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.027652 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d4f71a40-0089-4219-9ff4-837dfaf28b74-cache\") pod \"swift-storage-0\" (UID: 
\"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.027824 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d4f71a40-0089-4219-9ff4-837dfaf28b74-lock\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.027851 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99gwr\" (UniqueName: \"kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-kube-api-access-99gwr\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.027902 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.129083 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d4f71a40-0089-4219-9ff4-837dfaf28b74-lock\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.129136 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99gwr\" (UniqueName: \"kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-kube-api-access-99gwr\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.129174 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.129272 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.129333 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d4f71a40-0089-4219-9ff4-837dfaf28b74-cache\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.129798 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d4f71a40-0089-4219-9ff4-837dfaf28b74-lock\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: E1001 12:52:14.129924 4727 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.129819 4727 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d4f71a40-0089-4219-9ff4-837dfaf28b74-cache\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: E1001 12:52:14.129951 4727 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 12:52:14 crc kubenswrapper[4727]: E1001 12:52:14.130039 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift podName:d4f71a40-0089-4219-9ff4-837dfaf28b74 nodeName:}" failed. No retries permitted until 2025-10-01 12:52:14.63001828 +0000 UTC m=+912.951373177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift") pod "swift-storage-0" (UID: "d4f71a40-0089-4219-9ff4-837dfaf28b74") : configmap "swift-ring-files" not found Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.129936 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.153351 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.155398 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99gwr\" (UniqueName: \"kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-kube-api-access-99gwr\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.333366 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jwf2c"] Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.334410 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.336689 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.336924 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.337089 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.350916 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jwf2c"] Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.433206 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-combined-ca-bundle\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.433275 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94f8ea93-8124-4473-9a83-c70e83c642f0-scripts\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.433380 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94f8ea93-8124-4473-9a83-c70e83c642f0-etc-swift\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.433409 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-swiftconf\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.433591 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhxkt\" (UniqueName: \"kubernetes.io/projected/94f8ea93-8124-4473-9a83-c70e83c642f0-kube-api-access-bhxkt\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.433661 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94f8ea93-8124-4473-9a83-c70e83c642f0-ring-data-devices\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.433703 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-dispersionconf\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 
12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.535486 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94f8ea93-8124-4473-9a83-c70e83c642f0-scripts\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.535905 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94f8ea93-8124-4473-9a83-c70e83c642f0-etc-swift\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.535946 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-swiftconf\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.536103 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhxkt\" (UniqueName: \"kubernetes.io/projected/94f8ea93-8124-4473-9a83-c70e83c642f0-kube-api-access-bhxkt\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.536149 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94f8ea93-8124-4473-9a83-c70e83c642f0-ring-data-devices\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.536180 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-dispersionconf\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.536206 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-combined-ca-bundle\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.536399 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94f8ea93-8124-4473-9a83-c70e83c642f0-scripts\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.536823 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94f8ea93-8124-4473-9a83-c70e83c642f0-etc-swift\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.537149 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94f8ea93-8124-4473-9a83-c70e83c642f0-ring-data-devices\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.542579 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-dispersionconf\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.542785 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-swiftconf\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.543057 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-combined-ca-bundle\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.556780 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhxkt\" (UniqueName: \"kubernetes.io/projected/94f8ea93-8124-4473-9a83-c70e83c642f0-kube-api-access-bhxkt\") pod \"swift-ring-rebalance-jwf2c\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.637705 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:14 crc kubenswrapper[4727]: E1001 12:52:14.638147 4727 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 12:52:14 crc kubenswrapper[4727]: E1001 12:52:14.638273 4727 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 12:52:14 crc kubenswrapper[4727]: E1001 12:52:14.638412 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift podName:d4f71a40-0089-4219-9ff4-837dfaf28b74 nodeName:}" failed. No retries permitted until 2025-10-01 12:52:15.638390489 +0000 UTC m=+913.959745327 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift") pod "swift-storage-0" (UID: "d4f71a40-0089-4219-9ff4-837dfaf28b74") : configmap "swift-ring-files" not found Oct 01 12:52:14 crc kubenswrapper[4727]: I1001 12:52:14.665819 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:15 crc kubenswrapper[4727]: I1001 12:52:15.001265 4727 generic.go:334] "Generic (PLEG): container finished" podID="fa41b431-40d7-43db-a5a0-d05552cda2d1" containerID="0a063a6ae0e10474a4db8dabe6601c2ec7bac7eb5e98703dc6e80593d3d9ffe5" exitCode=0 Oct 01 12:52:15 crc kubenswrapper[4727]: I1001 12:52:15.001570 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j2xww" event={"ID":"fa41b431-40d7-43db-a5a0-d05552cda2d1","Type":"ContainerDied","Data":"0a063a6ae0e10474a4db8dabe6601c2ec7bac7eb5e98703dc6e80593d3d9ffe5"} Oct 01 12:52:15 crc kubenswrapper[4727]: I1001 12:52:15.100901 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jwf2c"] Oct 01 12:52:15 crc kubenswrapper[4727]: W1001 12:52:15.104359 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f8ea93_8124_4473_9a83_c70e83c642f0.slice/crio-5a72a5a39993d3fb5a16b56051cd4e46790c7e7a9677fdc53e734083df98189f WatchSource:0}: Error finding container 5a72a5a39993d3fb5a16b56051cd4e46790c7e7a9677fdc53e734083df98189f: Status 404 returned error can't find the container with id 5a72a5a39993d3fb5a16b56051cd4e46790c7e7a9677fdc53e734083df98189f Oct 01 12:52:15 crc kubenswrapper[4727]: I1001 12:52:15.652467 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:15 crc kubenswrapper[4727]: E1001 12:52:15.652699 4727 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 12:52:15 crc kubenswrapper[4727]: E1001 12:52:15.653284 4727 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 12:52:15 crc kubenswrapper[4727]: E1001 12:52:15.653346 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift podName:d4f71a40-0089-4219-9ff4-837dfaf28b74 nodeName:}" failed. No retries permitted until 2025-10-01 12:52:17.653327534 +0000 UTC m=+915.974682371 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift") pod "swift-storage-0" (UID: "d4f71a40-0089-4219-9ff4-837dfaf28b74") : configmap "swift-ring-files" not found Oct 01 12:52:16 crc kubenswrapper[4727]: I1001 12:52:16.011563 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j2xww" event={"ID":"fa41b431-40d7-43db-a5a0-d05552cda2d1","Type":"ContainerStarted","Data":"efe72741183f4e53e906a8d21a4524dd4a11d50d21ac582f95be94e343ad82bc"} Oct 01 12:52:16 crc kubenswrapper[4727]: I1001 12:52:16.012971 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jwf2c" event={"ID":"94f8ea93-8124-4473-9a83-c70e83c642f0","Type":"ContainerStarted","Data":"5a72a5a39993d3fb5a16b56051cd4e46790c7e7a9677fdc53e734083df98189f"} Oct 01 12:52:16 crc kubenswrapper[4727]: I1001 12:52:16.028970 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-j2xww" podStartSLOduration=4.028953341 podStartE2EDuration="4.028953341s" podCreationTimestamp="2025-10-01 12:52:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:52:16.0273372 +0000 UTC m=+914.348692047" watchObservedRunningTime="2025-10-01 12:52:16.028953341 +0000 UTC m=+914.350308188" Oct 01 12:52:17 crc kubenswrapper[4727]: I1001 12:52:17.020626 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:17 crc kubenswrapper[4727]: I1001 12:52:17.691219 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:17 crc kubenswrapper[4727]: E1001 12:52:17.691408 4727 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 12:52:17 crc kubenswrapper[4727]: E1001 12:52:17.691443 4727 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 12:52:17 crc kubenswrapper[4727]: E1001 12:52:17.691509 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift podName:d4f71a40-0089-4219-9ff4-837dfaf28b74 nodeName:}" failed. No retries permitted until 2025-10-01 12:52:21.691480031 +0000 UTC m=+920.012834868 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift") pod "swift-storage-0" (UID: "d4f71a40-0089-4219-9ff4-837dfaf28b74") : configmap "swift-ring-files" not found Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.573815 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-drxkp"] Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.575673 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-drxkp" Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.583099 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-drxkp"] Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.647432 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh7r4\" (UniqueName: \"kubernetes.io/projected/fee065f1-f9a2-43bd-ae70-94d196555b5f-kube-api-access-dh7r4\") pod \"keystone-db-create-drxkp\" (UID: \"fee065f1-f9a2-43bd-ae70-94d196555b5f\") " pod="openstack/keystone-db-create-drxkp" Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.700251 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.749519 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh7r4\" (UniqueName: \"kubernetes.io/projected/fee065f1-f9a2-43bd-ae70-94d196555b5f-kube-api-access-dh7r4\") pod \"keystone-db-create-drxkp\" (UID: \"fee065f1-f9a2-43bd-ae70-94d196555b5f\") " pod="openstack/keystone-db-create-drxkp" Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.780892 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh7r4\" (UniqueName: \"kubernetes.io/projected/fee065f1-f9a2-43bd-ae70-94d196555b5f-kube-api-access-dh7r4\") pod \"keystone-db-create-drxkp\" (UID: \"fee065f1-f9a2-43bd-ae70-94d196555b5f\") " pod="openstack/keystone-db-create-drxkp" Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.803391 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-5zh6t"] Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.804832 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5zh6t" Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.814085 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5zh6t"] Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.851050 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x46dr\" (UniqueName: \"kubernetes.io/projected/c449afe5-f791-4d64-9d5d-f3222f7a9f40-kube-api-access-x46dr\") pod \"placement-db-create-5zh6t\" (UID: \"c449afe5-f791-4d64-9d5d-f3222f7a9f40\") " pod="openstack/placement-db-create-5zh6t" Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.898925 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-drxkp" Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.953013 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x46dr\" (UniqueName: \"kubernetes.io/projected/c449afe5-f791-4d64-9d5d-f3222f7a9f40-kube-api-access-x46dr\") pod \"placement-db-create-5zh6t\" (UID: \"c449afe5-f791-4d64-9d5d-f3222f7a9f40\") " pod="openstack/placement-db-create-5zh6t" Oct 01 12:52:20 crc kubenswrapper[4727]: I1001 12:52:20.973818 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x46dr\" (UniqueName: \"kubernetes.io/projected/c449afe5-f791-4d64-9d5d-f3222f7a9f40-kube-api-access-x46dr\") pod \"placement-db-create-5zh6t\" (UID: \"c449afe5-f791-4d64-9d5d-f3222f7a9f40\") " pod="openstack/placement-db-create-5zh6t" Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.026906 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-v4zp9"] Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.028522 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v4zp9" Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.053533 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v4zp9"] Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.058022 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jwf2c" event={"ID":"94f8ea93-8124-4473-9a83-c70e83c642f0","Type":"ContainerStarted","Data":"9392f26eb8298e829c7f05110051ea11819d2f254aa10bae5d860b888810e57f"} Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.079143 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-jwf2c" podStartSLOduration=2.256276335 podStartE2EDuration="7.079123355s" podCreationTimestamp="2025-10-01 12:52:14 +0000 UTC" firstStartedPulling="2025-10-01 12:52:15.106634886 +0000 UTC m=+913.427989723" lastFinishedPulling="2025-10-01 12:52:19.929481896 +0000 UTC m=+918.250836743" observedRunningTime="2025-10-01 12:52:21.075227132 +0000 UTC m=+919.396581979" watchObservedRunningTime="2025-10-01 12:52:21.079123355 +0000 UTC m=+919.400478192" Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.139619 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5zh6t" Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.168703 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds8tp\" (UniqueName: \"kubernetes.io/projected/d250eeab-d323-428b-95de-ff6d859ee48b-kube-api-access-ds8tp\") pod \"glance-db-create-v4zp9\" (UID: \"d250eeab-d323-428b-95de-ff6d859ee48b\") " pod="openstack/glance-db-create-v4zp9" Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.270745 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds8tp\" (UniqueName: \"kubernetes.io/projected/d250eeab-d323-428b-95de-ff6d859ee48b-kube-api-access-ds8tp\") pod \"glance-db-create-v4zp9\" (UID: \"d250eeab-d323-428b-95de-ff6d859ee48b\") " pod="openstack/glance-db-create-v4zp9" Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.308230 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds8tp\" (UniqueName: \"kubernetes.io/projected/d250eeab-d323-428b-95de-ff6d859ee48b-kube-api-access-ds8tp\") pod \"glance-db-create-v4zp9\" (UID: \"d250eeab-d323-428b-95de-ff6d859ee48b\") " pod="openstack/glance-db-create-v4zp9" Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.358779 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-drxkp"] Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.371312 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v4zp9" Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.562200 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5zh6t"] Oct 01 12:52:21 crc kubenswrapper[4727]: W1001 12:52:21.568507 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc449afe5_f791_4d64_9d5d_f3222f7a9f40.slice/crio-396829c2fd149d388e2a2f2534ffbc27f2762c335183df7e53c1161c42f57c1b WatchSource:0}: Error finding container 396829c2fd149d388e2a2f2534ffbc27f2762c335183df7e53c1161c42f57c1b: Status 404 returned error can't find the container with id 396829c2fd149d388e2a2f2534ffbc27f2762c335183df7e53c1161c42f57c1b Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.781116 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:21 crc kubenswrapper[4727]: E1001 12:52:21.781340 4727 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 12:52:21 crc kubenswrapper[4727]: E1001 12:52:21.781375 4727 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 12:52:21 crc kubenswrapper[4727]: E1001 12:52:21.781450 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift podName:d4f71a40-0089-4219-9ff4-837dfaf28b74 nodeName:}" failed. No retries permitted until 2025-10-01 12:52:29.781430247 +0000 UTC m=+928.102785084 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift") pod "swift-storage-0" (UID: "d4f71a40-0089-4219-9ff4-837dfaf28b74") : configmap "swift-ring-files" not found Oct 01 12:52:21 crc kubenswrapper[4727]: I1001 12:52:21.787631 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v4zp9"] Oct 01 12:52:21 crc kubenswrapper[4727]: W1001 12:52:21.801808 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd250eeab_d323_428b_95de_ff6d859ee48b.slice/crio-d415d5408193ae0e06992fcf3e9c58ed4790403b5fad002f95e6a79eb48d95e9 WatchSource:0}: Error finding container d415d5408193ae0e06992fcf3e9c58ed4790403b5fad002f95e6a79eb48d95e9: Status 404 returned error can't find the container with id d415d5408193ae0e06992fcf3e9c58ed4790403b5fad002f95e6a79eb48d95e9 Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.067491 4727 generic.go:334] "Generic (PLEG): container finished" podID="fee065f1-f9a2-43bd-ae70-94d196555b5f" containerID="82c4031302220502d0e46eca03c679cab70717e6c7b534805db20a4238a158cd" exitCode=0 Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.067563 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-drxkp" event={"ID":"fee065f1-f9a2-43bd-ae70-94d196555b5f","Type":"ContainerDied","Data":"82c4031302220502d0e46eca03c679cab70717e6c7b534805db20a4238a158cd"} Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.067589 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-drxkp" event={"ID":"fee065f1-f9a2-43bd-ae70-94d196555b5f","Type":"ContainerStarted","Data":"7f5c68af067125133b4481ac5d11618f661923bfda006d2f2be7583fac9bd679"} Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.071802 4727 generic.go:334] "Generic (PLEG): container finished" podID="42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" containerID="cf1055988e7ddf033939cb0af9c823e7afdae43301f7bffa7308bfce6e4ca110" exitCode=0 Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.071897 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876","Type":"ContainerDied","Data":"cf1055988e7ddf033939cb0af9c823e7afdae43301f7bffa7308bfce6e4ca110"} Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.074368 4727 generic.go:334] "Generic (PLEG): container finished" podID="74ad068e-3c83-4fd2-af0a-7e45cd945411" containerID="f86c783bf5387b32ebbd44f58fac153a5bb6b813b9d45eab24f2dd3b42107af2" exitCode=0 Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.074443 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74ad068e-3c83-4fd2-af0a-7e45cd945411","Type":"ContainerDied","Data":"f86c783bf5387b32ebbd44f58fac153a5bb6b813b9d45eab24f2dd3b42107af2"} Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.082834 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v4zp9" event={"ID":"d250eeab-d323-428b-95de-ff6d859ee48b","Type":"ContainerStarted","Data":"4d095756fe013c4991248c3b64acdbc2cadcd44c4b3611d8467380c33f620472"} Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.082950 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v4zp9" 
event={"ID":"d250eeab-d323-428b-95de-ff6d859ee48b","Type":"ContainerStarted","Data":"d415d5408193ae0e06992fcf3e9c58ed4790403b5fad002f95e6a79eb48d95e9"} Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.087151 4727 generic.go:334] "Generic (PLEG): container finished" podID="c449afe5-f791-4d64-9d5d-f3222f7a9f40" containerID="9a1f32e45bd50fa3cc563f87b03d0aa1f5f0c9f467e0658706d1657118a6ff31" exitCode=0 Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.087291 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5zh6t" event={"ID":"c449afe5-f791-4d64-9d5d-f3222f7a9f40","Type":"ContainerDied","Data":"9a1f32e45bd50fa3cc563f87b03d0aa1f5f0c9f467e0658706d1657118a6ff31"} Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.087353 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5zh6t" event={"ID":"c449afe5-f791-4d64-9d5d-f3222f7a9f40","Type":"ContainerStarted","Data":"396829c2fd149d388e2a2f2534ffbc27f2762c335183df7e53c1161c42f57c1b"} Oct 01 12:52:22 crc kubenswrapper[4727]: I1001 12:52:22.167828 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-v4zp9" podStartSLOduration=1.167806173 podStartE2EDuration="1.167806173s" podCreationTimestamp="2025-10-01 12:52:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:52:22.159290804 +0000 UTC m=+920.480645641" watchObservedRunningTime="2025-10-01 12:52:22.167806173 +0000 UTC m=+920.489161010" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.089175 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.098429 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876","Type":"ContainerStarted","Data":"d4be881e32be1bbc9d14d2cf1ac2c6bd783bdb664cb9a3a02a4b98b201bc72ec"} Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.098644 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.100754 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74ad068e-3c83-4fd2-af0a-7e45cd945411","Type":"ContainerStarted","Data":"8a81300adaa1a79c48be60f260d6b2c72adcabfd8abff87258b84d4734083da2"} Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.101009 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.102484 4727 generic.go:334] "Generic (PLEG): container finished" podID="d250eeab-d323-428b-95de-ff6d859ee48b" containerID="4d095756fe013c4991248c3b64acdbc2cadcd44c4b3611d8467380c33f620472" exitCode=0 Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.102535 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v4zp9" event={"ID":"d250eeab-d323-428b-95de-ff6d859ee48b","Type":"ContainerDied","Data":"4d095756fe013c4991248c3b64acdbc2cadcd44c4b3611d8467380c33f620472"} Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.163800 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rhqzr"] Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.164090 4727 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" podUID="c3c6c756-7f34-4dad-a114-9e080ec40524" containerName="dnsmasq-dns" containerID="cri-o://9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277" gracePeriod=10 Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.195493 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371969.65931 podStartE2EDuration="1m7.195465597s" podCreationTimestamp="2025-10-01 12:51:16 +0000 UTC" firstStartedPulling="2025-10-01 12:51:18.705515477 +0000 UTC m=+857.026870314" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:52:23.181134975 +0000 UTC m=+921.502489822" watchObservedRunningTime="2025-10-01 12:52:23.195465597 +0000 UTC m=+921.516820434" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.214869 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.065921277 podStartE2EDuration="1m7.214854248s" podCreationTimestamp="2025-10-01 12:51:16 +0000 UTC" firstStartedPulling="2025-10-01 12:51:18.244234531 +0000 UTC m=+856.565589368" lastFinishedPulling="2025-10-01 12:51:48.393167502 +0000 UTC m=+886.714522339" observedRunningTime="2025-10-01 12:52:23.213305039 +0000 UTC m=+921.534659896" watchObservedRunningTime="2025-10-01 12:52:23.214854248 +0000 UTC m=+921.536209085" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.543792 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-drxkp" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.618659 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh7r4\" (UniqueName: \"kubernetes.io/projected/fee065f1-f9a2-43bd-ae70-94d196555b5f-kube-api-access-dh7r4\") pod \"fee065f1-f9a2-43bd-ae70-94d196555b5f\" (UID: \"fee065f1-f9a2-43bd-ae70-94d196555b5f\") " Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.624215 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee065f1-f9a2-43bd-ae70-94d196555b5f-kube-api-access-dh7r4" (OuterVolumeSpecName: "kube-api-access-dh7r4") pod "fee065f1-f9a2-43bd-ae70-94d196555b5f" (UID: "fee065f1-f9a2-43bd-ae70-94d196555b5f"). InnerVolumeSpecName "kube-api-access-dh7r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.632835 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5zh6t" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.651866 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.720067 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-ovsdbserver-sb\") pod \"c3c6c756-7f34-4dad-a114-9e080ec40524\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.720416 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-ovsdbserver-nb\") pod \"c3c6c756-7f34-4dad-a114-9e080ec40524\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.720480 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64w2n\" (UniqueName: \"kubernetes.io/projected/c3c6c756-7f34-4dad-a114-9e080ec40524-kube-api-access-64w2n\") pod \"c3c6c756-7f34-4dad-a114-9e080ec40524\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.720502 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x46dr\" (UniqueName: \"kubernetes.io/projected/c449afe5-f791-4d64-9d5d-f3222f7a9f40-kube-api-access-x46dr\") pod \"c449afe5-f791-4d64-9d5d-f3222f7a9f40\" (UID: \"c449afe5-f791-4d64-9d5d-f3222f7a9f40\") " Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.720563 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-dns-svc\") pod \"c3c6c756-7f34-4dad-a114-9e080ec40524\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.720632 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-config\") pod \"c3c6c756-7f34-4dad-a114-9e080ec40524\" (UID: \"c3c6c756-7f34-4dad-a114-9e080ec40524\") " Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.720979 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh7r4\" (UniqueName: \"kubernetes.io/projected/fee065f1-f9a2-43bd-ae70-94d196555b5f-kube-api-access-dh7r4\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.724989 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c449afe5-f791-4d64-9d5d-f3222f7a9f40-kube-api-access-x46dr" (OuterVolumeSpecName: "kube-api-access-x46dr") pod "c449afe5-f791-4d64-9d5d-f3222f7a9f40" (UID: "c449afe5-f791-4d64-9d5d-f3222f7a9f40"). InnerVolumeSpecName "kube-api-access-x46dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.726032 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c6c756-7f34-4dad-a114-9e080ec40524-kube-api-access-64w2n" (OuterVolumeSpecName: "kube-api-access-64w2n") pod "c3c6c756-7f34-4dad-a114-9e080ec40524" (UID: "c3c6c756-7f34-4dad-a114-9e080ec40524"). InnerVolumeSpecName "kube-api-access-64w2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.765306 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c3c6c756-7f34-4dad-a114-9e080ec40524" (UID: "c3c6c756-7f34-4dad-a114-9e080ec40524"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.769708 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3c6c756-7f34-4dad-a114-9e080ec40524" (UID: "c3c6c756-7f34-4dad-a114-9e080ec40524"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.770074 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-config" (OuterVolumeSpecName: "config") pod "c3c6c756-7f34-4dad-a114-9e080ec40524" (UID: "c3c6c756-7f34-4dad-a114-9e080ec40524"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.778543 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c3c6c756-7f34-4dad-a114-9e080ec40524" (UID: "c3c6c756-7f34-4dad-a114-9e080ec40524"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.822396 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64w2n\" (UniqueName: \"kubernetes.io/projected/c3c6c756-7f34-4dad-a114-9e080ec40524-kube-api-access-64w2n\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.822443 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x46dr\" (UniqueName: \"kubernetes.io/projected/c449afe5-f791-4d64-9d5d-f3222f7a9f40-kube-api-access-x46dr\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.822456 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.822466 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.822479 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:23 crc kubenswrapper[4727]: I1001 12:52:23.822488 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3c6c756-7f34-4dad-a114-9e080ec40524-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.109985 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5zh6t" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.110011 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5zh6t" event={"ID":"c449afe5-f791-4d64-9d5d-f3222f7a9f40","Type":"ContainerDied","Data":"396829c2fd149d388e2a2f2534ffbc27f2762c335183df7e53c1161c42f57c1b"} Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.110043 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="396829c2fd149d388e2a2f2534ffbc27f2762c335183df7e53c1161c42f57c1b" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.112699 4727 generic.go:334] "Generic (PLEG): container finished" podID="c3c6c756-7f34-4dad-a114-9e080ec40524" containerID="9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277" exitCode=0 Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.112796 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" event={"ID":"c3c6c756-7f34-4dad-a114-9e080ec40524","Type":"ContainerDied","Data":"9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277"} Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.112830 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" event={"ID":"c3c6c756-7f34-4dad-a114-9e080ec40524","Type":"ContainerDied","Data":"13b4d49df791acd152222a7b10d91c3560791c5aa287af54711e9c70d1624f82"} Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.112852 4727 scope.go:117] "RemoveContainer" containerID="9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.112811 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rhqzr" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.114712 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-drxkp" event={"ID":"fee065f1-f9a2-43bd-ae70-94d196555b5f","Type":"ContainerDied","Data":"7f5c68af067125133b4481ac5d11618f661923bfda006d2f2be7583fac9bd679"} Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.114743 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f5c68af067125133b4481ac5d11618f661923bfda006d2f2be7583fac9bd679" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.114784 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-drxkp" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.134955 4727 scope.go:117] "RemoveContainer" containerID="fb09c277734d73aee838d0fe4c5093bcaf07692683a6fe004a6be9082ba7b938" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.159043 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rhqzr"] Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.173097 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rhqzr"] Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.185763 4727 scope.go:117] "RemoveContainer" containerID="9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277" Oct 01 12:52:24 crc kubenswrapper[4727]: E1001 12:52:24.186317 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277\": container with ID starting with 9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277 not found: ID does not exist" containerID="9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.186391 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277"} err="failed to get container status \"9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277\": rpc error: code = NotFound desc = could not find container \"9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277\": container with ID starting with 9f636215c2cd2f73135c252052e48e0d8a1981981be761fcbe6bff1cbcba7277 not found: ID does not exist" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.186423 4727 scope.go:117] "RemoveContainer" containerID="fb09c277734d73aee838d0fe4c5093bcaf07692683a6fe004a6be9082ba7b938" Oct 01 12:52:24 crc kubenswrapper[4727]: E1001 12:52:24.187890 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb09c277734d73aee838d0fe4c5093bcaf07692683a6fe004a6be9082ba7b938\": container with ID starting with fb09c277734d73aee838d0fe4c5093bcaf07692683a6fe004a6be9082ba7b938 not found: ID does not exist" containerID="fb09c277734d73aee838d0fe4c5093bcaf07692683a6fe004a6be9082ba7b938" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.187934 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb09c277734d73aee838d0fe4c5093bcaf07692683a6fe004a6be9082ba7b938"} err="failed to get container status \"fb09c277734d73aee838d0fe4c5093bcaf07692683a6fe004a6be9082ba7b938\": rpc error: code = NotFound desc = could not find container \"fb09c277734d73aee838d0fe4c5093bcaf07692683a6fe004a6be9082ba7b938\": container with ID starting with fb09c277734d73aee838d0fe4c5093bcaf07692683a6fe004a6be9082ba7b938 not found: ID does not exist" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.382372 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3c6c756-7f34-4dad-a114-9e080ec40524" path="/var/lib/kubelet/pods/c3c6c756-7f34-4dad-a114-9e080ec40524/volumes" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.497797 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-v4zp9" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.532984 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds8tp\" (UniqueName: \"kubernetes.io/projected/d250eeab-d323-428b-95de-ff6d859ee48b-kube-api-access-ds8tp\") pod \"d250eeab-d323-428b-95de-ff6d859ee48b\" (UID: \"d250eeab-d323-428b-95de-ff6d859ee48b\") " Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.538309 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d250eeab-d323-428b-95de-ff6d859ee48b-kube-api-access-ds8tp" (OuterVolumeSpecName: "kube-api-access-ds8tp") pod "d250eeab-d323-428b-95de-ff6d859ee48b" (UID: "d250eeab-d323-428b-95de-ff6d859ee48b"). InnerVolumeSpecName "kube-api-access-ds8tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:24 crc kubenswrapper[4727]: I1001 12:52:24.634598 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds8tp\" (UniqueName: \"kubernetes.io/projected/d250eeab-d323-428b-95de-ff6d859ee48b-kube-api-access-ds8tp\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:25 crc kubenswrapper[4727]: I1001 12:52:25.125247 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v4zp9" event={"ID":"d250eeab-d323-428b-95de-ff6d859ee48b","Type":"ContainerDied","Data":"d415d5408193ae0e06992fcf3e9c58ed4790403b5fad002f95e6a79eb48d95e9"} Oct 01 12:52:25 crc kubenswrapper[4727]: I1001 12:52:25.125283 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d415d5408193ae0e06992fcf3e9c58ed4790403b5fad002f95e6a79eb48d95e9" Oct 01 12:52:25 crc kubenswrapper[4727]: I1001 12:52:25.125319 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-v4zp9" Oct 01 12:52:26 crc kubenswrapper[4727]: I1001 12:52:26.342899 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-v56sx" podUID="fb0c554e-ed3f-4476-9963-dabc0089698d" containerName="ovn-controller" probeResult="failure" output=< Oct 01 12:52:26 crc kubenswrapper[4727]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 01 12:52:26 crc kubenswrapper[4727]: > Oct 01 12:52:29 crc kubenswrapper[4727]: I1001 12:52:29.154861 4727 generic.go:334] "Generic (PLEG): container finished" podID="94f8ea93-8124-4473-9a83-c70e83c642f0" containerID="9392f26eb8298e829c7f05110051ea11819d2f254aa10bae5d860b888810e57f" exitCode=0 Oct 01 12:52:29 crc kubenswrapper[4727]: I1001 12:52:29.154961 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jwf2c" event={"ID":"94f8ea93-8124-4473-9a83-c70e83c642f0","Type":"ContainerDied","Data":"9392f26eb8298e829c7f05110051ea11819d2f254aa10bae5d860b888810e57f"} Oct 01 12:52:29 crc kubenswrapper[4727]: I1001 12:52:29.815031 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:29 crc kubenswrapper[4727]: I1001 12:52:29.823241 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d4f71a40-0089-4219-9ff4-837dfaf28b74-etc-swift\") pod \"swift-storage-0\" (UID: \"d4f71a40-0089-4219-9ff4-837dfaf28b74\") " pod="openstack/swift-storage-0" Oct 01 12:52:29 crc kubenswrapper[4727]: I1001 12:52:29.843481 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.415878 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.499680 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.525116 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-swiftconf\") pod \"94f8ea93-8124-4473-9a83-c70e83c642f0\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.525153 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhxkt\" (UniqueName: \"kubernetes.io/projected/94f8ea93-8124-4473-9a83-c70e83c642f0-kube-api-access-bhxkt\") pod \"94f8ea93-8124-4473-9a83-c70e83c642f0\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.525204 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-dispersionconf\") pod \"94f8ea93-8124-4473-9a83-c70e83c642f0\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.525246 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94f8ea93-8124-4473-9a83-c70e83c642f0-ring-data-devices\") pod \"94f8ea93-8124-4473-9a83-c70e83c642f0\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.525293 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94f8ea93-8124-4473-9a83-c70e83c642f0-scripts\") pod \"94f8ea93-8124-4473-9a83-c70e83c642f0\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.525373 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94f8ea93-8124-4473-9a83-c70e83c642f0-etc-swift\") pod \"94f8ea93-8124-4473-9a83-c70e83c642f0\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.525481 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-combined-ca-bundle\") pod \"94f8ea93-8124-4473-9a83-c70e83c642f0\" (UID: \"94f8ea93-8124-4473-9a83-c70e83c642f0\") " Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.526192 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94f8ea93-8124-4473-9a83-c70e83c642f0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "94f8ea93-8124-4473-9a83-c70e83c642f0" (UID: "94f8ea93-8124-4473-9a83-c70e83c642f0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.526537 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94f8ea93-8124-4473-9a83-c70e83c642f0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "94f8ea93-8124-4473-9a83-c70e83c642f0" (UID: "94f8ea93-8124-4473-9a83-c70e83c642f0"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.540600 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f8ea93-8124-4473-9a83-c70e83c642f0-kube-api-access-bhxkt" (OuterVolumeSpecName: "kube-api-access-bhxkt") pod "94f8ea93-8124-4473-9a83-c70e83c642f0" (UID: "94f8ea93-8124-4473-9a83-c70e83c642f0"). InnerVolumeSpecName "kube-api-access-bhxkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.543807 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "94f8ea93-8124-4473-9a83-c70e83c642f0" (UID: "94f8ea93-8124-4473-9a83-c70e83c642f0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.554477 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94f8ea93-8124-4473-9a83-c70e83c642f0" (UID: "94f8ea93-8124-4473-9a83-c70e83c642f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.554759 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "94f8ea93-8124-4473-9a83-c70e83c642f0" (UID: "94f8ea93-8124-4473-9a83-c70e83c642f0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.572704 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94f8ea93-8124-4473-9a83-c70e83c642f0-scripts" (OuterVolumeSpecName: "scripts") pod "94f8ea93-8124-4473-9a83-c70e83c642f0" (UID: "94f8ea93-8124-4473-9a83-c70e83c642f0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.580651 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c969-account-create-n4qx8"] Oct 01 12:52:30 crc kubenswrapper[4727]: E1001 12:52:30.581345 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f8ea93-8124-4473-9a83-c70e83c642f0" containerName="swift-ring-rebalance" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.584181 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f8ea93-8124-4473-9a83-c70e83c642f0" containerName="swift-ring-rebalance" Oct 01 12:52:30 crc kubenswrapper[4727]: E1001 12:52:30.584458 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d250eeab-d323-428b-95de-ff6d859ee48b" containerName="mariadb-database-create" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.584517 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="d250eeab-d323-428b-95de-ff6d859ee48b" containerName="mariadb-database-create" Oct 01 12:52:30 crc kubenswrapper[4727]: E1001 12:52:30.584609 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c6c756-7f34-4dad-a114-9e080ec40524" containerName="dnsmasq-dns" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.584658 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c6c756-7f34-4dad-a114-9e080ec40524" containerName="dnsmasq-dns" Oct 01 12:52:30 crc kubenswrapper[4727]: E1001 12:52:30.584720 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c449afe5-f791-4d64-9d5d-f3222f7a9f40" containerName="mariadb-database-create" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.584771 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="c449afe5-f791-4d64-9d5d-f3222f7a9f40" containerName="mariadb-database-create" Oct 01 12:52:30 crc kubenswrapper[4727]: E1001 12:52:30.584826 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee065f1-f9a2-43bd-ae70-94d196555b5f" containerName="mariadb-database-create" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.584879 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee065f1-f9a2-43bd-ae70-94d196555b5f" containerName="mariadb-database-create" Oct 01 12:52:30 crc kubenswrapper[4727]: E1001 12:52:30.584989 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c6c756-7f34-4dad-a114-9e080ec40524" containerName="init" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.585082 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c6c756-7f34-4dad-a114-9e080ec40524" containerName="init" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.585429 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="d250eeab-d323-428b-95de-ff6d859ee48b" containerName="mariadb-database-create" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.585493 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="c449afe5-f791-4d64-9d5d-f3222f7a9f40" containerName="mariadb-database-create" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.585555 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="94f8ea93-8124-4473-9a83-c70e83c642f0" containerName="swift-ring-rebalance" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.585606 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3c6c756-7f34-4dad-a114-9e080ec40524" containerName="dnsmasq-dns" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.585658 4727 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fee065f1-f9a2-43bd-ae70-94d196555b5f" containerName="mariadb-database-create" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.586895 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c969-account-create-n4qx8" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.590363 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c969-account-create-n4qx8"] Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.591260 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.627077 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtv4m\" (UniqueName: \"kubernetes.io/projected/409c0051-d099-44b5-97bb-93d7a47a91e6-kube-api-access-mtv4m\") pod \"keystone-c969-account-create-n4qx8\" (UID: \"409c0051-d099-44b5-97bb-93d7a47a91e6\") " pod="openstack/keystone-c969-account-create-n4qx8" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.627181 4727 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.627197 4727 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94f8ea93-8124-4473-9a83-c70e83c642f0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.627210 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94f8ea93-8124-4473-9a83-c70e83c642f0-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.627222 4727 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94f8ea93-8124-4473-9a83-c70e83c642f0-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.627236 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.627248 4727 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94f8ea93-8124-4473-9a83-c70e83c642f0-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.627259 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhxkt\" (UniqueName: \"kubernetes.io/projected/94f8ea93-8124-4473-9a83-c70e83c642f0-kube-api-access-bhxkt\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.728753 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtv4m\" (UniqueName: \"kubernetes.io/projected/409c0051-d099-44b5-97bb-93d7a47a91e6-kube-api-access-mtv4m\") pod \"keystone-c969-account-create-n4qx8\" (UID: \"409c0051-d099-44b5-97bb-93d7a47a91e6\") " pod="openstack/keystone-c969-account-create-n4qx8" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.744640 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtv4m\" (UniqueName: 
\"kubernetes.io/projected/409c0051-d099-44b5-97bb-93d7a47a91e6-kube-api-access-mtv4m\") pod \"keystone-c969-account-create-n4qx8\" (UID: \"409c0051-d099-44b5-97bb-93d7a47a91e6\") " pod="openstack/keystone-c969-account-create-n4qx8" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.877478 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-82e6-account-create-tbgvp"] Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.879273 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-82e6-account-create-tbgvp" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.881705 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.889838 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-82e6-account-create-tbgvp"] Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.908425 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c969-account-create-n4qx8" Oct 01 12:52:30 crc kubenswrapper[4727]: I1001 12:52:30.931465 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxqtt\" (UniqueName: \"kubernetes.io/projected/4f48d653-b71d-4f18-be4f-c8883b023f59-kube-api-access-pxqtt\") pod \"placement-82e6-account-create-tbgvp\" (UID: \"4f48d653-b71d-4f18-be4f-c8883b023f59\") " pod="openstack/placement-82e6-account-create-tbgvp" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.032490 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxqtt\" (UniqueName: \"kubernetes.io/projected/4f48d653-b71d-4f18-be4f-c8883b023f59-kube-api-access-pxqtt\") pod \"placement-82e6-account-create-tbgvp\" (UID: \"4f48d653-b71d-4f18-be4f-c8883b023f59\") " pod="openstack/placement-82e6-account-create-tbgvp" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.054247 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxqtt\" (UniqueName: \"kubernetes.io/projected/4f48d653-b71d-4f18-be4f-c8883b023f59-kube-api-access-pxqtt\") pod \"placement-82e6-account-create-tbgvp\" (UID: \"4f48d653-b71d-4f18-be4f-c8883b023f59\") " pod="openstack/placement-82e6-account-create-tbgvp" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.130959 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c969-account-create-n4qx8"] Oct 01 12:52:31 crc kubenswrapper[4727]: W1001 12:52:31.140843 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod409c0051_d099_44b5_97bb_93d7a47a91e6.slice/crio-593105e2e03201d4927c6aa3d035ea5a57aecc0aaffcb768d10b5b7b791e7fb4 WatchSource:0}: Error finding container 593105e2e03201d4927c6aa3d035ea5a57aecc0aaffcb768d10b5b7b791e7fb4: Status 404 returned error can't find the container with id 593105e2e03201d4927c6aa3d035ea5a57aecc0aaffcb768d10b5b7b791e7fb4 Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.171372 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jwf2c" event={"ID":"94f8ea93-8124-4473-9a83-c70e83c642f0","Type":"ContainerDied","Data":"5a72a5a39993d3fb5a16b56051cd4e46790c7e7a9677fdc53e734083df98189f"} Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.171414 4727 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="5a72a5a39993d3fb5a16b56051cd4e46790c7e7a9677fdc53e734083df98189f" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.171464 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jwf2c" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.173073 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"3bbb7810fe31db4b34e95a74f9b0faa304eb6d51f101f5b2b8c1169efde40e4b"} Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.174256 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c969-account-create-n4qx8" event={"ID":"409c0051-d099-44b5-97bb-93d7a47a91e6","Type":"ContainerStarted","Data":"593105e2e03201d4927c6aa3d035ea5a57aecc0aaffcb768d10b5b7b791e7fb4"} Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.204962 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-82e6-account-create-tbgvp" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.211931 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-691e-account-create-xrs5s"] Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.213666 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-691e-account-create-xrs5s" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.219775 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.227921 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-691e-account-create-xrs5s"] Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.235764 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnkj4\" (UniqueName: \"kubernetes.io/projected/1c5853f7-9f25-4027-a1bd-0917f55f0fb5-kube-api-access-dnkj4\") pod \"glance-691e-account-create-xrs5s\" (UID: \"1c5853f7-9f25-4027-a1bd-0917f55f0fb5\") " pod="openstack/glance-691e-account-create-xrs5s" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.337573 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnkj4\" (UniqueName: \"kubernetes.io/projected/1c5853f7-9f25-4027-a1bd-0917f55f0fb5-kube-api-access-dnkj4\") pod \"glance-691e-account-create-xrs5s\" (UID: \"1c5853f7-9f25-4027-a1bd-0917f55f0fb5\") " pod="openstack/glance-691e-account-create-xrs5s" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.340906 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-v56sx" podUID="fb0c554e-ed3f-4476-9963-dabc0089698d" containerName="ovn-controller" probeResult="failure" output=< Oct 01 12:52:31 crc kubenswrapper[4727]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 01 12:52:31 crc kubenswrapper[4727]: > Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.357416 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnkj4\" (UniqueName: \"kubernetes.io/projected/1c5853f7-9f25-4027-a1bd-0917f55f0fb5-kube-api-access-dnkj4\") pod \"glance-691e-account-create-xrs5s\" (UID: \"1c5853f7-9f25-4027-a1bd-0917f55f0fb5\") " pod="openstack/glance-691e-account-create-xrs5s" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.437083 4727 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.444666 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-f9xhb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.549313 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-691e-account-create-xrs5s" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.678232 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-82e6-account-create-tbgvp"] Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.697901 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v56sx-config-2dmkb"] Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.701447 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.706988 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.707986 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v56sx-config-2dmkb"] Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.744917 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-run-ovn\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.747084 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kqnj\" (UniqueName: \"kubernetes.io/projected/12d7536f-9998-499a-94a2-547b3d02ae8a-kube-api-access-9kqnj\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.747396 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12d7536f-9998-499a-94a2-547b3d02ae8a-additional-scripts\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.747938 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-run\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.748178 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12d7536f-9998-499a-94a2-547b3d02ae8a-scripts\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.748547 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-log-ovn\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.850902 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-log-ovn\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.850975 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-run-ovn\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.851045 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kqnj\" (UniqueName: \"kubernetes.io/projected/12d7536f-9998-499a-94a2-547b3d02ae8a-kube-api-access-9kqnj\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.851096 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12d7536f-9998-499a-94a2-547b3d02ae8a-additional-scripts\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.851569 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-run\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.851605 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12d7536f-9998-499a-94a2-547b3d02ae8a-scripts\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.854346 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-log-ovn\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.854424 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-run-ovn\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.854551 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-run\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.854891 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12d7536f-9998-499a-94a2-547b3d02ae8a-additional-scripts\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.855392 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12d7536f-9998-499a-94a2-547b3d02ae8a-scripts\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.892122 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kqnj\" (UniqueName: \"kubernetes.io/projected/12d7536f-9998-499a-94a2-547b3d02ae8a-kube-api-access-9kqnj\") pod \"ovn-controller-v56sx-config-2dmkb\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:31 crc kubenswrapper[4727]: I1001 12:52:31.982361 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:32 crc kubenswrapper[4727]: I1001 12:52:32.074384 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-691e-account-create-xrs5s"] Oct 01 12:52:32 crc kubenswrapper[4727]: I1001 12:52:32.223529 4727 generic.go:334] "Generic (PLEG): container finished" podID="409c0051-d099-44b5-97bb-93d7a47a91e6" containerID="9a589aab940c5d3eb6b9317b4e6cfbe014f2dc533aaf9c66e3830925ed99ebc0" exitCode=0 Oct 01 12:52:32 crc kubenswrapper[4727]: I1001 12:52:32.223630 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c969-account-create-n4qx8" event={"ID":"409c0051-d099-44b5-97bb-93d7a47a91e6","Type":"ContainerDied","Data":"9a589aab940c5d3eb6b9317b4e6cfbe014f2dc533aaf9c66e3830925ed99ebc0"} Oct 01 12:52:32 crc kubenswrapper[4727]: I1001 12:52:32.238146 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-82e6-account-create-tbgvp" event={"ID":"4f48d653-b71d-4f18-be4f-c8883b023f59","Type":"ContainerStarted","Data":"5590afea4d629222d948fc2cd4248dd205bef8cfb95e13be86ef218ec42481b9"} Oct 01 12:52:32 crc kubenswrapper[4727]: I1001 12:52:32.238201 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-82e6-account-create-tbgvp" event={"ID":"4f48d653-b71d-4f18-be4f-c8883b023f59","Type":"ContainerStarted","Data":"1ffd3620920f77bb4642b1b17ecea2d927fe58288788c1cd244b122f5f5254b3"} Oct 01 12:52:32 crc kubenswrapper[4727]: I1001 12:52:32.273635 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-82e6-account-create-tbgvp" podStartSLOduration=2.273608444 podStartE2EDuration="2.273608444s" podCreationTimestamp="2025-10-01 12:52:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:52:32.265182289 +0000 UTC m=+930.586537126" watchObservedRunningTime="2025-10-01 12:52:32.273608444 +0000 UTC 
m=+930.594963281" Oct 01 12:52:32 crc kubenswrapper[4727]: W1001 12:52:32.749678 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c5853f7_9f25_4027_a1bd_0917f55f0fb5.slice/crio-82f5298c0c4342f674876ab682ad5f1be13517c95c9845ffaf3f2b2ed19e815f WatchSource:0}: Error finding container 82f5298c0c4342f674876ab682ad5f1be13517c95c9845ffaf3f2b2ed19e815f: Status 404 returned error can't find the container with id 82f5298c0c4342f674876ab682ad5f1be13517c95c9845ffaf3f2b2ed19e815f Oct 01 12:52:33 crc kubenswrapper[4727]: I1001 12:52:33.255202 4727 generic.go:334] "Generic (PLEG): container finished" podID="4f48d653-b71d-4f18-be4f-c8883b023f59" containerID="5590afea4d629222d948fc2cd4248dd205bef8cfb95e13be86ef218ec42481b9" exitCode=0 Oct 01 12:52:33 crc kubenswrapper[4727]: I1001 12:52:33.255559 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-82e6-account-create-tbgvp" event={"ID":"4f48d653-b71d-4f18-be4f-c8883b023f59","Type":"ContainerDied","Data":"5590afea4d629222d948fc2cd4248dd205bef8cfb95e13be86ef218ec42481b9"} Oct 01 12:52:33 crc kubenswrapper[4727]: I1001 12:52:33.259534 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-691e-account-create-xrs5s" event={"ID":"1c5853f7-9f25-4027-a1bd-0917f55f0fb5","Type":"ContainerStarted","Data":"c89584cce96dda52b5be727159cc31fe921bf13ebf960fdf9dcc8d50a5d5cefb"} Oct 01 12:52:33 crc kubenswrapper[4727]: I1001 12:52:33.259569 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-691e-account-create-xrs5s" event={"ID":"1c5853f7-9f25-4027-a1bd-0917f55f0fb5","Type":"ContainerStarted","Data":"82f5298c0c4342f674876ab682ad5f1be13517c95c9845ffaf3f2b2ed19e815f"} Oct 01 12:52:33 crc kubenswrapper[4727]: I1001 12:52:33.292419 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:52:33 crc kubenswrapper[4727]: I1001 12:52:33.292704 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:52:33 crc kubenswrapper[4727]: I1001 12:52:33.307660 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v56sx-config-2dmkb"] Oct 01 12:52:33 crc kubenswrapper[4727]: W1001 12:52:33.307950 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12d7536f_9998_499a_94a2_547b3d02ae8a.slice/crio-bc22097b212ad50753c29422f4c93575c7909339c75b22ee6f5ffbcfb53b4346 WatchSource:0}: Error finding container bc22097b212ad50753c29422f4c93575c7909339c75b22ee6f5ffbcfb53b4346: Status 404 returned error can't find the container with id bc22097b212ad50753c29422f4c93575c7909339c75b22ee6f5ffbcfb53b4346 Oct 01 12:52:33 crc kubenswrapper[4727]: I1001 12:52:33.322958 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-691e-account-create-xrs5s" podStartSLOduration=2.322933162 podStartE2EDuration="2.322933162s" podCreationTimestamp="2025-10-01 12:52:31 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:52:33.302481897 +0000 UTC m=+931.623836724" watchObservedRunningTime="2025-10-01 12:52:33.322933162 +0000 UTC m=+931.644288009" Oct 01 12:52:33 crc kubenswrapper[4727]: I1001 12:52:33.640495 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c969-account-create-n4qx8" Oct 01 12:52:33 crc kubenswrapper[4727]: I1001 12:52:33.720370 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtv4m\" (UniqueName: \"kubernetes.io/projected/409c0051-d099-44b5-97bb-93d7a47a91e6-kube-api-access-mtv4m\") pod \"409c0051-d099-44b5-97bb-93d7a47a91e6\" (UID: \"409c0051-d099-44b5-97bb-93d7a47a91e6\") " Oct 01 12:52:33 crc kubenswrapper[4727]: I1001 12:52:33.726169 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/409c0051-d099-44b5-97bb-93d7a47a91e6-kube-api-access-mtv4m" (OuterVolumeSpecName: "kube-api-access-mtv4m") pod "409c0051-d099-44b5-97bb-93d7a47a91e6" (UID: "409c0051-d099-44b5-97bb-93d7a47a91e6"). InnerVolumeSpecName "kube-api-access-mtv4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:33 crc kubenswrapper[4727]: I1001 12:52:33.822040 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtv4m\" (UniqueName: \"kubernetes.io/projected/409c0051-d099-44b5-97bb-93d7a47a91e6-kube-api-access-mtv4m\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.269275 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"a72fd4a1924bcdc80c9c0d61dab29fa070a7420c2b0ac10b1841e467c19329d2"} Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.269656 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"9b1efcb9adc2b3cbed7f553d869c216cdddbbdb5e6a947885e05007e30ea2d76"} Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.269673 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"bf2429490241150a3bf256b1eba83702bffb1b12824275d97d122a28257b0219"} Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.270738 4727 generic.go:334] "Generic (PLEG): container finished" podID="12d7536f-9998-499a-94a2-547b3d02ae8a" containerID="a4603e5caddbf7627c75b0b98e350ee46f9285b810ec96e9169baf930f9cec2d" exitCode=0 Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.270796 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v56sx-config-2dmkb" event={"ID":"12d7536f-9998-499a-94a2-547b3d02ae8a","Type":"ContainerDied","Data":"a4603e5caddbf7627c75b0b98e350ee46f9285b810ec96e9169baf930f9cec2d"} Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.270821 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v56sx-config-2dmkb" event={"ID":"12d7536f-9998-499a-94a2-547b3d02ae8a","Type":"ContainerStarted","Data":"bc22097b212ad50753c29422f4c93575c7909339c75b22ee6f5ffbcfb53b4346"} Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.272421 4727 generic.go:334] "Generic (PLEG): container finished" podID="1c5853f7-9f25-4027-a1bd-0917f55f0fb5" 
containerID="c89584cce96dda52b5be727159cc31fe921bf13ebf960fdf9dcc8d50a5d5cefb" exitCode=0 Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.272468 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-691e-account-create-xrs5s" event={"ID":"1c5853f7-9f25-4027-a1bd-0917f55f0fb5","Type":"ContainerDied","Data":"c89584cce96dda52b5be727159cc31fe921bf13ebf960fdf9dcc8d50a5d5cefb"} Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.276946 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c969-account-create-n4qx8" Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.276963 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c969-account-create-n4qx8" event={"ID":"409c0051-d099-44b5-97bb-93d7a47a91e6","Type":"ContainerDied","Data":"593105e2e03201d4927c6aa3d035ea5a57aecc0aaffcb768d10b5b7b791e7fb4"} Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.277010 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="593105e2e03201d4927c6aa3d035ea5a57aecc0aaffcb768d10b5b7b791e7fb4" Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.616302 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-82e6-account-create-tbgvp" Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.737950 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxqtt\" (UniqueName: \"kubernetes.io/projected/4f48d653-b71d-4f18-be4f-c8883b023f59-kube-api-access-pxqtt\") pod \"4f48d653-b71d-4f18-be4f-c8883b023f59\" (UID: \"4f48d653-b71d-4f18-be4f-c8883b023f59\") " Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.747368 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f48d653-b71d-4f18-be4f-c8883b023f59-kube-api-access-pxqtt" (OuterVolumeSpecName: "kube-api-access-pxqtt") pod "4f48d653-b71d-4f18-be4f-c8883b023f59" (UID: "4f48d653-b71d-4f18-be4f-c8883b023f59"). InnerVolumeSpecName "kube-api-access-pxqtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:34 crc kubenswrapper[4727]: I1001 12:52:34.840615 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxqtt\" (UniqueName: \"kubernetes.io/projected/4f48d653-b71d-4f18-be4f-c8883b023f59-kube-api-access-pxqtt\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.287058 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"3920bbb495e754a3ab825c5c511b90067b02e44f1a02f095606d432cbf1bfdb2"} Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.288879 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-82e6-account-create-tbgvp" event={"ID":"4f48d653-b71d-4f18-be4f-c8883b023f59","Type":"ContainerDied","Data":"1ffd3620920f77bb4642b1b17ecea2d927fe58288788c1cd244b122f5f5254b3"} Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.288913 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ffd3620920f77bb4642b1b17ecea2d927fe58288788c1cd244b122f5f5254b3" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.289216 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-82e6-account-create-tbgvp" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.671188 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-691e-account-create-xrs5s" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.680970 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.753561 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-run-ovn\") pod \"12d7536f-9998-499a-94a2-547b3d02ae8a\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.753693 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnkj4\" (UniqueName: \"kubernetes.io/projected/1c5853f7-9f25-4027-a1bd-0917f55f0fb5-kube-api-access-dnkj4\") pod \"1c5853f7-9f25-4027-a1bd-0917f55f0fb5\" (UID: \"1c5853f7-9f25-4027-a1bd-0917f55f0fb5\") " Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.753716 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-run\") pod \"12d7536f-9998-499a-94a2-547b3d02ae8a\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.753741 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12d7536f-9998-499a-94a2-547b3d02ae8a-additional-scripts\") pod \"12d7536f-9998-499a-94a2-547b3d02ae8a\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.753771 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12d7536f-9998-499a-94a2-547b3d02ae8a-scripts\") pod \"12d7536f-9998-499a-94a2-547b3d02ae8a\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.753806 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kqnj\" (UniqueName: \"kubernetes.io/projected/12d7536f-9998-499a-94a2-547b3d02ae8a-kube-api-access-9kqnj\") pod \"12d7536f-9998-499a-94a2-547b3d02ae8a\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.753886 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-log-ovn\") pod \"12d7536f-9998-499a-94a2-547b3d02ae8a\" (UID: \"12d7536f-9998-499a-94a2-547b3d02ae8a\") " Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.754223 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "12d7536f-9998-499a-94a2-547b3d02ae8a" (UID: "12d7536f-9998-499a-94a2-547b3d02ae8a"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.754258 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "12d7536f-9998-499a-94a2-547b3d02ae8a" (UID: "12d7536f-9998-499a-94a2-547b3d02ae8a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.754619 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-run" (OuterVolumeSpecName: "var-run") pod "12d7536f-9998-499a-94a2-547b3d02ae8a" (UID: "12d7536f-9998-499a-94a2-547b3d02ae8a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.756054 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12d7536f-9998-499a-94a2-547b3d02ae8a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "12d7536f-9998-499a-94a2-547b3d02ae8a" (UID: "12d7536f-9998-499a-94a2-547b3d02ae8a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.756516 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12d7536f-9998-499a-94a2-547b3d02ae8a-scripts" (OuterVolumeSpecName: "scripts") pod "12d7536f-9998-499a-94a2-547b3d02ae8a" (UID: "12d7536f-9998-499a-94a2-547b3d02ae8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.759908 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c5853f7-9f25-4027-a1bd-0917f55f0fb5-kube-api-access-dnkj4" (OuterVolumeSpecName: "kube-api-access-dnkj4") pod "1c5853f7-9f25-4027-a1bd-0917f55f0fb5" (UID: "1c5853f7-9f25-4027-a1bd-0917f55f0fb5"). InnerVolumeSpecName "kube-api-access-dnkj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.760029 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12d7536f-9998-499a-94a2-547b3d02ae8a-kube-api-access-9kqnj" (OuterVolumeSpecName: "kube-api-access-9kqnj") pod "12d7536f-9998-499a-94a2-547b3d02ae8a" (UID: "12d7536f-9998-499a-94a2-547b3d02ae8a"). InnerVolumeSpecName "kube-api-access-9kqnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.855154 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnkj4\" (UniqueName: \"kubernetes.io/projected/1c5853f7-9f25-4027-a1bd-0917f55f0fb5-kube-api-access-dnkj4\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.855197 4727 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-run\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.855209 4727 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/12d7536f-9998-499a-94a2-547b3d02ae8a-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.855217 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12d7536f-9998-499a-94a2-547b3d02ae8a-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.855225 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kqnj\" (UniqueName: \"kubernetes.io/projected/12d7536f-9998-499a-94a2-547b3d02ae8a-kube-api-access-9kqnj\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.855233 4727 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:35 crc kubenswrapper[4727]: I1001 12:52:35.855242 4727 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/12d7536f-9998-499a-94a2-547b3d02ae8a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.305709 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v56sx-config-2dmkb" event={"ID":"12d7536f-9998-499a-94a2-547b3d02ae8a","Type":"ContainerDied","Data":"bc22097b212ad50753c29422f4c93575c7909339c75b22ee6f5ffbcfb53b4346"} Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.305775 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc22097b212ad50753c29422f4c93575c7909339c75b22ee6f5ffbcfb53b4346" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.305723 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v56sx-config-2dmkb" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.313456 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-691e-account-create-xrs5s" event={"ID":"1c5853f7-9f25-4027-a1bd-0917f55f0fb5","Type":"ContainerDied","Data":"82f5298c0c4342f674876ab682ad5f1be13517c95c9845ffaf3f2b2ed19e815f"} Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.313489 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82f5298c0c4342f674876ab682ad5f1be13517c95c9845ffaf3f2b2ed19e815f" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.313537 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-691e-account-create-xrs5s" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.350190 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-v56sx" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.823339 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-v56sx-config-2dmkb"] Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.831488 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-v56sx-config-2dmkb"] Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.932747 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-v56sx-config-2b9ng"] Oct 01 12:52:36 crc kubenswrapper[4727]: E1001 12:52:36.933105 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="409c0051-d099-44b5-97bb-93d7a47a91e6" containerName="mariadb-account-create" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.933119 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="409c0051-d099-44b5-97bb-93d7a47a91e6" containerName="mariadb-account-create" Oct 01 12:52:36 crc kubenswrapper[4727]: E1001 12:52:36.933141 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d7536f-9998-499a-94a2-547b3d02ae8a" containerName="ovn-config" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.933147 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d7536f-9998-499a-94a2-547b3d02ae8a" containerName="ovn-config" Oct 01 12:52:36 crc kubenswrapper[4727]: E1001 12:52:36.933156 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f48d653-b71d-4f18-be4f-c8883b023f59" containerName="mariadb-account-create" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.933162 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f48d653-b71d-4f18-be4f-c8883b023f59" containerName="mariadb-account-create" Oct 01 12:52:36 crc kubenswrapper[4727]: E1001 12:52:36.933183 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c5853f7-9f25-4027-a1bd-0917f55f0fb5" containerName="mariadb-account-create" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.933189 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5853f7-9f25-4027-a1bd-0917f55f0fb5" containerName="mariadb-account-create" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.933352 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="409c0051-d099-44b5-97bb-93d7a47a91e6" containerName="mariadb-account-create" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.933385 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c5853f7-9f25-4027-a1bd-0917f55f0fb5" containerName="mariadb-account-create" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.933406 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f48d653-b71d-4f18-be4f-c8883b023f59" containerName="mariadb-account-create" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.933418 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="12d7536f-9998-499a-94a2-547b3d02ae8a" containerName="ovn-config" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.933905 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.935988 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.949235 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v56sx-config-2b9ng"] Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.972970 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-run-ovn\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.973033 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81e4757-31e3-4edd-921c-324a8085be56-scripts\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.973081 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-run\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.973124 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-log-ovn\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.973206 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p22qj\" (UniqueName: \"kubernetes.io/projected/c81e4757-31e3-4edd-921c-324a8085be56-kube-api-access-p22qj\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:36 crc kubenswrapper[4727]: I1001 12:52:36.973245 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c81e4757-31e3-4edd-921c-324a8085be56-additional-scripts\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.075038 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-run-ovn\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.075098 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81e4757-31e3-4edd-921c-324a8085be56-scripts\") pod 
\"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.075141 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-run\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.075199 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-log-ovn\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.075277 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p22qj\" (UniqueName: \"kubernetes.io/projected/c81e4757-31e3-4edd-921c-324a8085be56-kube-api-access-p22qj\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.075332 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c81e4757-31e3-4edd-921c-324a8085be56-additional-scripts\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.075396 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-run-ovn\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.075406 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-run\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.075459 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-log-ovn\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.076323 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c81e4757-31e3-4edd-921c-324a8085be56-additional-scripts\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.077519 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81e4757-31e3-4edd-921c-324a8085be56-scripts\") pod 
\"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.095669 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p22qj\" (UniqueName: \"kubernetes.io/projected/c81e4757-31e3-4edd-921c-324a8085be56-kube-api-access-p22qj\") pod \"ovn-controller-v56sx-config-2b9ng\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.260086 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.328486 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"c859e209749b4d00bc096c8efe63ee0cb1251df2f274741356e48a9cb792a2ad"} Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.328536 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"1e2d03bddb34143bd44dad17fb970c2ab9a3a5c5d47c56964d8cfec99f06e608"} Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.328547 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"d6d2e49c5c1c2b24796520b088a93567e9ecb797415db356605939273731ac5a"} Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.328556 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"d7d328130462d7a0a7c966de69db62695ba7a8c08a56fc3c0744bbb97c35ecb5"} Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.616330 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.722215 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-v56sx-config-2b9ng"] Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.926686 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-75b77"] Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.927947 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-75b77" Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.939910 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-75b77"] Oct 01 12:52:37 crc kubenswrapper[4727]: I1001 12:52:37.992867 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq7dj\" (UniqueName: \"kubernetes.io/projected/fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9-kube-api-access-kq7dj\") pod \"cinder-db-create-75b77\" (UID: \"fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9\") " pod="openstack/cinder-db-create-75b77" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.036043 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-cth6t"] Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.037037 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cth6t" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.055798 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cth6t"] Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.085843 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.093901 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq7dj\" (UniqueName: \"kubernetes.io/projected/fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9-kube-api-access-kq7dj\") pod \"cinder-db-create-75b77\" (UID: \"fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9\") " pod="openstack/cinder-db-create-75b77" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.094050 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwkw4\" (UniqueName: \"kubernetes.io/projected/04b59d75-3cf2-451d-b303-07dac30964e5-kube-api-access-qwkw4\") pod \"barbican-db-create-cth6t\" (UID: \"04b59d75-3cf2-451d-b303-07dac30964e5\") " pod="openstack/barbican-db-create-cth6t" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.126331 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq7dj\" (UniqueName: \"kubernetes.io/projected/fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9-kube-api-access-kq7dj\") pod \"cinder-db-create-75b77\" (UID: \"fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9\") " pod="openstack/cinder-db-create-75b77" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.198247 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwkw4\" (UniqueName: \"kubernetes.io/projected/04b59d75-3cf2-451d-b303-07dac30964e5-kube-api-access-qwkw4\") pod \"barbican-db-create-cth6t\" (UID: \"04b59d75-3cf2-451d-b303-07dac30964e5\") " pod="openstack/barbican-db-create-cth6t" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.227116 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwkw4\" (UniqueName: \"kubernetes.io/projected/04b59d75-3cf2-451d-b303-07dac30964e5-kube-api-access-qwkw4\") pod \"barbican-db-create-cth6t\" (UID: \"04b59d75-3cf2-451d-b303-07dac30964e5\") " pod="openstack/barbican-db-create-cth6t" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.256934 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-75b77" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.294230 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-h9swf"] Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.301403 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.306985 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.307372 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.307605 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.308342 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jxtdc" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.324793 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-h9swf"] Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.348135 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-btgc6"] Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.349247 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-btgc6" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.363044 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cth6t" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.425883 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12d7536f-9998-499a-94a2-547b3d02ae8a" path="/var/lib/kubelet/pods/12d7536f-9998-499a-94a2-547b3d02ae8a/volumes" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.426723 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-btgc6"] Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.435720 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v56sx-config-2b9ng" event={"ID":"c81e4757-31e3-4edd-921c-324a8085be56","Type":"ContainerStarted","Data":"f07080fd9430a01d2c99c0c3de32e8c23556bc26cf9169584c1f40a6ac15656f"} Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.435769 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v56sx-config-2b9ng" event={"ID":"c81e4757-31e3-4edd-921c-324a8085be56","Type":"ContainerStarted","Data":"5d1d80b24958ea052c6f12147e906e71e92f5f9abc43de3b34dbf8a5f63fd353"} Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.462547 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-v56sx-config-2b9ng" podStartSLOduration=2.462526904 podStartE2EDuration="2.462526904s" podCreationTimestamp="2025-10-01 12:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:52:38.45541662 +0000 UTC m=+936.776771457" watchObservedRunningTime="2025-10-01 12:52:38.462526904 +0000 UTC m=+936.783881751" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.502316 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnplv\" (UniqueName: \"kubernetes.io/projected/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-kube-api-access-jnplv\") pod \"keystone-db-sync-h9swf\" (UID: \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\") " pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.502393 4727 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcn7m\" (UniqueName: \"kubernetes.io/projected/d5e1c176-87f7-401f-9137-00d70f843212-kube-api-access-dcn7m\") pod \"neutron-db-create-btgc6\" (UID: \"d5e1c176-87f7-401f-9137-00d70f843212\") " pod="openstack/neutron-db-create-btgc6" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.502425 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-combined-ca-bundle\") pod \"keystone-db-sync-h9swf\" (UID: \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\") " pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.502456 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-config-data\") pod \"keystone-db-sync-h9swf\" (UID: \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\") " pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.608026 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-combined-ca-bundle\") pod \"keystone-db-sync-h9swf\" (UID: \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\") " pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.608471 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-config-data\") pod \"keystone-db-sync-h9swf\" (UID: \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\") " pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.608616 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnplv\" (UniqueName: \"kubernetes.io/projected/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-kube-api-access-jnplv\") pod \"keystone-db-sync-h9swf\" (UID: \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\") " pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.608693 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcn7m\" (UniqueName: \"kubernetes.io/projected/d5e1c176-87f7-401f-9137-00d70f843212-kube-api-access-dcn7m\") pod \"neutron-db-create-btgc6\" (UID: \"d5e1c176-87f7-401f-9137-00d70f843212\") " pod="openstack/neutron-db-create-btgc6" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.615292 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-combined-ca-bundle\") pod \"keystone-db-sync-h9swf\" (UID: \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\") " pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.623611 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-config-data\") pod \"keystone-db-sync-h9swf\" (UID: \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\") " pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.627135 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnplv\" (UniqueName: 
\"kubernetes.io/projected/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-kube-api-access-jnplv\") pod \"keystone-db-sync-h9swf\" (UID: \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\") " pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.628743 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcn7m\" (UniqueName: \"kubernetes.io/projected/d5e1c176-87f7-401f-9137-00d70f843212-kube-api-access-dcn7m\") pod \"neutron-db-create-btgc6\" (UID: \"d5e1c176-87f7-401f-9137-00d70f843212\") " pod="openstack/neutron-db-create-btgc6" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.677597 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.685589 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-btgc6" Oct 01 12:52:38 crc kubenswrapper[4727]: I1001 12:52:38.788051 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-75b77"] Oct 01 12:52:38 crc kubenswrapper[4727]: W1001 12:52:38.811531 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdf7d4ae_6ebf_4bdb_9a09_8aae270477a9.slice/crio-e7d36d57a8314ddd6d202e64afd74cd08c3df6b75305b6fc69b6a17f18196c20 WatchSource:0}: Error finding container e7d36d57a8314ddd6d202e64afd74cd08c3df6b75305b6fc69b6a17f18196c20: Status 404 returned error can't find the container with id e7d36d57a8314ddd6d202e64afd74cd08c3df6b75305b6fc69b6a17f18196c20 Oct 01 12:52:39 crc kubenswrapper[4727]: I1001 12:52:39.444926 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-75b77" event={"ID":"fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9","Type":"ContainerStarted","Data":"e7d36d57a8314ddd6d202e64afd74cd08c3df6b75305b6fc69b6a17f18196c20"} Oct 01 12:52:39 crc kubenswrapper[4727]: I1001 12:52:39.448137 4727 generic.go:334] "Generic (PLEG): container finished" podID="c81e4757-31e3-4edd-921c-324a8085be56" containerID="f07080fd9430a01d2c99c0c3de32e8c23556bc26cf9169584c1f40a6ac15656f" exitCode=0 Oct 01 12:52:39 crc kubenswrapper[4727]: I1001 12:52:39.448267 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v56sx-config-2b9ng" event={"ID":"c81e4757-31e3-4edd-921c-324a8085be56","Type":"ContainerDied","Data":"f07080fd9430a01d2c99c0c3de32e8c23556bc26cf9169584c1f40a6ac15656f"} Oct 01 12:52:39 crc kubenswrapper[4727]: I1001 12:52:39.538458 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-h9swf"] Oct 01 12:52:39 crc kubenswrapper[4727]: I1001 12:52:39.657139 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cth6t"] Oct 01 12:52:39 crc kubenswrapper[4727]: I1001 12:52:39.674960 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-btgc6"] Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.458183 4727 generic.go:334] "Generic (PLEG): container finished" podID="d5e1c176-87f7-401f-9137-00d70f843212" containerID="610be50434a4cd2af19ada4d94fda4e0e0720676a36cd500c14d25fa3f0f9b69" exitCode=0 Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.458590 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-btgc6" 
event={"ID":"d5e1c176-87f7-401f-9137-00d70f843212","Type":"ContainerDied","Data":"610be50434a4cd2af19ada4d94fda4e0e0720676a36cd500c14d25fa3f0f9b69"} Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.458619 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-btgc6" event={"ID":"d5e1c176-87f7-401f-9137-00d70f843212","Type":"ContainerStarted","Data":"5a12ad49fb56b1a0fb49ea30362c933501dd7c2eccd27737d2374f36b56871b1"} Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.460815 4727 generic.go:334] "Generic (PLEG): container finished" podID="04b59d75-3cf2-451d-b303-07dac30964e5" containerID="08d10bd2e8da08a15291a224e9194caa4505321d5f0cbd297ac3f2d3f5561a80" exitCode=0 Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.460860 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cth6t" event={"ID":"04b59d75-3cf2-451d-b303-07dac30964e5","Type":"ContainerDied","Data":"08d10bd2e8da08a15291a224e9194caa4505321d5f0cbd297ac3f2d3f5561a80"} Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.460881 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cth6t" event={"ID":"04b59d75-3cf2-451d-b303-07dac30964e5","Type":"ContainerStarted","Data":"59f3fd962ff52d8e908621877e70c7d881e6d5f5affc40935b87d57a0faa00a6"} Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.462627 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h9swf" event={"ID":"615b1b59-cd92-4d09-bce0-5c3ee394a7b3","Type":"ContainerStarted","Data":"641b759b3316997afe8f31bd6b35fa60bef7479c82af44ed3ccc376d7d8b0ca2"} Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.464295 4727 generic.go:334] "Generic (PLEG): container finished" podID="fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9" containerID="f11dceb574f0003ce6d97d923ac3d0852a169bb4c1ddf18b53905f14898a4351" exitCode=0 Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.464342 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-75b77" event={"ID":"fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9","Type":"ContainerDied","Data":"f11dceb574f0003ce6d97d923ac3d0852a169bb4c1ddf18b53905f14898a4351"} Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.468538 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"4e010b567f30fb233fb64d92d9e06bd6c6979096ca6f47b1f708c80e619ad1e3"} Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.468574 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"9cb9c5cc186fcce2db58848471ed45a3f25f3ca113fe1851a519450afe859af4"} Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.468586 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"d714ab4cde0013d5ff3bab91cdc4828832151e9652ba0c00d0a9c5119760da0f"} Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.468596 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"adb6cc7b782023856f995fe6e3019d5029edf30eb41a48f970f4e1c92b7bd072"} Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.468608 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"2edefd36f67e40719a1392b0d724b4d041596e80184c7b9f681162783edbc0ef"} Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.805194 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.952758 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p22qj\" (UniqueName: \"kubernetes.io/projected/c81e4757-31e3-4edd-921c-324a8085be56-kube-api-access-p22qj\") pod \"c81e4757-31e3-4edd-921c-324a8085be56\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.952820 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-run\") pod \"c81e4757-31e3-4edd-921c-324a8085be56\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.952853 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81e4757-31e3-4edd-921c-324a8085be56-scripts\") pod \"c81e4757-31e3-4edd-921c-324a8085be56\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.952973 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c81e4757-31e3-4edd-921c-324a8085be56-additional-scripts\") pod \"c81e4757-31e3-4edd-921c-324a8085be56\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.953046 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-log-ovn\") pod \"c81e4757-31e3-4edd-921c-324a8085be56\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.953170 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-run-ovn\") pod \"c81e4757-31e3-4edd-921c-324a8085be56\" (UID: \"c81e4757-31e3-4edd-921c-324a8085be56\") " Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.953239 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-run" (OuterVolumeSpecName: "var-run") pod "c81e4757-31e3-4edd-921c-324a8085be56" (UID: "c81e4757-31e3-4edd-921c-324a8085be56"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.953295 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c81e4757-31e3-4edd-921c-324a8085be56" (UID: "c81e4757-31e3-4edd-921c-324a8085be56"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.953388 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c81e4757-31e3-4edd-921c-324a8085be56" (UID: "c81e4757-31e3-4edd-921c-324a8085be56"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.953609 4727 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.953628 4727 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.953640 4727 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c81e4757-31e3-4edd-921c-324a8085be56-var-run\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.953694 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81e4757-31e3-4edd-921c-324a8085be56-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c81e4757-31e3-4edd-921c-324a8085be56" (UID: "c81e4757-31e3-4edd-921c-324a8085be56"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.954121 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81e4757-31e3-4edd-921c-324a8085be56-scripts" (OuterVolumeSpecName: "scripts") pod "c81e4757-31e3-4edd-921c-324a8085be56" (UID: "c81e4757-31e3-4edd-921c-324a8085be56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:40 crc kubenswrapper[4727]: I1001 12:52:40.960117 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c81e4757-31e3-4edd-921c-324a8085be56-kube-api-access-p22qj" (OuterVolumeSpecName: "kube-api-access-p22qj") pod "c81e4757-31e3-4edd-921c-324a8085be56" (UID: "c81e4757-31e3-4edd-921c-324a8085be56"). InnerVolumeSpecName "kube-api-access-p22qj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.054883 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p22qj\" (UniqueName: \"kubernetes.io/projected/c81e4757-31e3-4edd-921c-324a8085be56-kube-api-access-p22qj\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.054915 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81e4757-31e3-4edd-921c-324a8085be56-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.054924 4727 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c81e4757-31e3-4edd-921c-324a8085be56-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.442460 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-w52kz"] Oct 01 12:52:41 crc kubenswrapper[4727]: E1001 12:52:41.443955 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81e4757-31e3-4edd-921c-324a8085be56" containerName="ovn-config" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.443982 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81e4757-31e3-4edd-921c-324a8085be56" containerName="ovn-config" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.444210 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="c81e4757-31e3-4edd-921c-324a8085be56" containerName="ovn-config" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.446372 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.448189 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.448355 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-w7frw" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.452862 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w52kz"] Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.494658 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"d67d24410c0cb1d8f968d7340ad44bda2970c30e4e89807064febb1bd83ecbba"} Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.494714 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d4f71a40-0089-4219-9ff4-837dfaf28b74","Type":"ContainerStarted","Data":"58b0ccc13b88c684e154f8e1982c0e340f8de00548a9a3279714dcaf9877bb06"} Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.497980 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-v56sx-config-2b9ng" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.498226 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-v56sx-config-2b9ng" event={"ID":"c81e4757-31e3-4edd-921c-324a8085be56","Type":"ContainerDied","Data":"5d1d80b24958ea052c6f12147e906e71e92f5f9abc43de3b34dbf8a5f63fd353"} Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.498265 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d1d80b24958ea052c6f12147e906e71e92f5f9abc43de3b34dbf8a5f63fd353" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.532909 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-v56sx-config-2b9ng"] Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.548573 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-v56sx-config-2b9ng"] Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.565486 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-db-sync-config-data\") pod \"glance-db-sync-w52kz\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.565576 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-combined-ca-bundle\") pod \"glance-db-sync-w52kz\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.565595 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-config-data\") pod \"glance-db-sync-w52kz\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.565648 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55gfl\" (UniqueName: \"kubernetes.io/projected/5cf7db27-ff87-481e-a776-cb171e57f4b9-kube-api-access-55gfl\") pod \"glance-db-sync-w52kz\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.667743 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55gfl\" (UniqueName: \"kubernetes.io/projected/5cf7db27-ff87-481e-a776-cb171e57f4b9-kube-api-access-55gfl\") pod \"glance-db-sync-w52kz\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.667815 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-db-sync-config-data\") pod \"glance-db-sync-w52kz\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.667891 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-combined-ca-bundle\") pod \"glance-db-sync-w52kz\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.667910 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-config-data\") pod \"glance-db-sync-w52kz\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.674639 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-db-sync-config-data\") pod \"glance-db-sync-w52kz\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.675833 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-combined-ca-bundle\") pod \"glance-db-sync-w52kz\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.676236 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-config-data\") pod \"glance-db-sync-w52kz\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.685958 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55gfl\" (UniqueName: \"kubernetes.io/projected/5cf7db27-ff87-481e-a776-cb171e57f4b9-kube-api-access-55gfl\") pod \"glance-db-sync-w52kz\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.761871 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w52kz" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.828896 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.20009136 podStartE2EDuration="29.828875137s" podCreationTimestamp="2025-10-01 12:52:12 +0000 UTC" firstStartedPulling="2025-10-01 12:52:30.427054694 +0000 UTC m=+928.748409531" lastFinishedPulling="2025-10-01 12:52:39.055838451 +0000 UTC m=+937.377193308" observedRunningTime="2025-10-01 12:52:41.553037415 +0000 UTC m=+939.874392272" watchObservedRunningTime="2025-10-01 12:52:41.828875137 +0000 UTC m=+940.150229994" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.842187 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-x2drr"] Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.843679 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.847212 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.863556 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cth6t" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.871272 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-x2drr"] Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.974574 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwkw4\" (UniqueName: \"kubernetes.io/projected/04b59d75-3cf2-451d-b303-07dac30964e5-kube-api-access-qwkw4\") pod \"04b59d75-3cf2-451d-b303-07dac30964e5\" (UID: \"04b59d75-3cf2-451d-b303-07dac30964e5\") " Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.975267 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.975323 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hnqr\" (UniqueName: \"kubernetes.io/projected/02e320cc-d38f-4f37-9353-395b869f907e-kube-api-access-7hnqr\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.975397 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-config\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.975429 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.975459 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.975543 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.982783 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b59d75-3cf2-451d-b303-07dac30964e5-kube-api-access-qwkw4" (OuterVolumeSpecName: "kube-api-access-qwkw4") pod "04b59d75-3cf2-451d-b303-07dac30964e5" (UID: "04b59d75-3cf2-451d-b303-07dac30964e5"). InnerVolumeSpecName "kube-api-access-qwkw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:41 crc kubenswrapper[4727]: I1001 12:52:41.992301 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-btgc6" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.024264 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-75b77" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.078786 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.078933 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hnqr\" (UniqueName: \"kubernetes.io/projected/02e320cc-d38f-4f37-9353-395b869f907e-kube-api-access-7hnqr\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.079765 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-config\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.079806 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.080469 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-config\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.080565 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.080622 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.081248 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.081386 4727 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.082411 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.082576 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwkw4\" (UniqueName: \"kubernetes.io/projected/04b59d75-3cf2-451d-b303-07dac30964e5-kube-api-access-qwkw4\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.083086 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.099808 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hnqr\" (UniqueName: \"kubernetes.io/projected/02e320cc-d38f-4f37-9353-395b869f907e-kube-api-access-7hnqr\") pod \"dnsmasq-dns-77585f5f8c-x2drr\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.164695 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.183491 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcn7m\" (UniqueName: \"kubernetes.io/projected/d5e1c176-87f7-401f-9137-00d70f843212-kube-api-access-dcn7m\") pod \"d5e1c176-87f7-401f-9137-00d70f843212\" (UID: \"d5e1c176-87f7-401f-9137-00d70f843212\") " Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.183548 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq7dj\" (UniqueName: \"kubernetes.io/projected/fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9-kube-api-access-kq7dj\") pod \"fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9\" (UID: \"fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9\") " Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.186190 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e1c176-87f7-401f-9137-00d70f843212-kube-api-access-dcn7m" (OuterVolumeSpecName: "kube-api-access-dcn7m") pod "d5e1c176-87f7-401f-9137-00d70f843212" (UID: "d5e1c176-87f7-401f-9137-00d70f843212"). InnerVolumeSpecName "kube-api-access-dcn7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.186417 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9-kube-api-access-kq7dj" (OuterVolumeSpecName: "kube-api-access-kq7dj") pod "fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9" (UID: "fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9"). 
InnerVolumeSpecName "kube-api-access-kq7dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.284871 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq7dj\" (UniqueName: \"kubernetes.io/projected/fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9-kube-api-access-kq7dj\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.284930 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcn7m\" (UniqueName: \"kubernetes.io/projected/d5e1c176-87f7-401f-9137-00d70f843212-kube-api-access-dcn7m\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.397702 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c81e4757-31e3-4edd-921c-324a8085be56" path="/var/lib/kubelet/pods/c81e4757-31e3-4edd-921c-324a8085be56/volumes" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.413412 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w52kz"] Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.508286 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-75b77" event={"ID":"fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9","Type":"ContainerDied","Data":"e7d36d57a8314ddd6d202e64afd74cd08c3df6b75305b6fc69b6a17f18196c20"} Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.508338 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7d36d57a8314ddd6d202e64afd74cd08c3df6b75305b6fc69b6a17f18196c20" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.508426 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-75b77" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.510733 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w52kz" event={"ID":"5cf7db27-ff87-481e-a776-cb171e57f4b9","Type":"ContainerStarted","Data":"5e62e258658704d926581be80de1e6988565f788ecc5485eb544031f8b66578f"} Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.512204 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-btgc6" event={"ID":"d5e1c176-87f7-401f-9137-00d70f843212","Type":"ContainerDied","Data":"5a12ad49fb56b1a0fb49ea30362c933501dd7c2eccd27737d2374f36b56871b1"} Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.512234 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a12ad49fb56b1a0fb49ea30362c933501dd7c2eccd27737d2374f36b56871b1" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.512289 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-btgc6" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.515309 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cth6t" event={"ID":"04b59d75-3cf2-451d-b303-07dac30964e5","Type":"ContainerDied","Data":"59f3fd962ff52d8e908621877e70c7d881e6d5f5affc40935b87d57a0faa00a6"} Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.515337 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59f3fd962ff52d8e908621877e70c7d881e6d5f5affc40935b87d57a0faa00a6" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.515644 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cth6t" Oct 01 12:52:42 crc kubenswrapper[4727]: I1001 12:52:42.599227 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-x2drr"] Oct 01 12:52:44 crc kubenswrapper[4727]: W1001 12:52:44.784693 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02e320cc_d38f_4f37_9353_395b869f907e.slice/crio-bf5f36e3ff4cfd07e908a91326eed2973cfc1dec33663ec50451d6eac5adb746 WatchSource:0}: Error finding container bf5f36e3ff4cfd07e908a91326eed2973cfc1dec33663ec50451d6eac5adb746: Status 404 returned error can't find the container with id bf5f36e3ff4cfd07e908a91326eed2973cfc1dec33663ec50451d6eac5adb746 Oct 01 12:52:45 crc kubenswrapper[4727]: I1001 12:52:45.547807 4727 generic.go:334] "Generic (PLEG): container finished" podID="02e320cc-d38f-4f37-9353-395b869f907e" containerID="6201cdb8b3bdfcf391a403999d27d7f25314bdbe3ad2b808f288dcf3e3ff2363" exitCode=0 Oct 01 12:52:45 crc kubenswrapper[4727]: I1001 12:52:45.548195 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" event={"ID":"02e320cc-d38f-4f37-9353-395b869f907e","Type":"ContainerDied","Data":"6201cdb8b3bdfcf391a403999d27d7f25314bdbe3ad2b808f288dcf3e3ff2363"} Oct 01 12:52:45 crc kubenswrapper[4727]: I1001 12:52:45.548222 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" event={"ID":"02e320cc-d38f-4f37-9353-395b869f907e","Type":"ContainerStarted","Data":"bf5f36e3ff4cfd07e908a91326eed2973cfc1dec33663ec50451d6eac5adb746"} Oct 01 12:52:45 crc kubenswrapper[4727]: I1001 12:52:45.551098 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h9swf" event={"ID":"615b1b59-cd92-4d09-bce0-5c3ee394a7b3","Type":"ContainerStarted","Data":"bf23e0eabc43f46342e3bb785ad57dc13f4253b398318a5778337cf75e0573a1"} Oct 01 12:52:45 crc kubenswrapper[4727]: I1001 12:52:45.602348 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-h9swf" podStartSLOduration=2.301674992 podStartE2EDuration="7.602326929s" podCreationTimestamp="2025-10-01 12:52:38 +0000 UTC" firstStartedPulling="2025-10-01 12:52:39.555553279 +0000 UTC m=+937.876908116" lastFinishedPulling="2025-10-01 12:52:44.856205206 +0000 UTC m=+943.177560053" observedRunningTime="2025-10-01 12:52:45.60142172 +0000 UTC m=+943.922776557" watchObservedRunningTime="2025-10-01 12:52:45.602326929 +0000 UTC m=+943.923681786" Oct 01 12:52:46 crc kubenswrapper[4727]: I1001 12:52:46.560952 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" event={"ID":"02e320cc-d38f-4f37-9353-395b869f907e","Type":"ContainerStarted","Data":"d399556aa9b4c4b2c56dbe148a07ce603cd692ca35c4e36ff9c74d71683c1ab9"} Oct 01 12:52:46 crc kubenswrapper[4727]: I1001 12:52:46.582618 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" podStartSLOduration=5.5826009899999995 podStartE2EDuration="5.58260099s" podCreationTimestamp="2025-10-01 12:52:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:52:46.578508761 +0000 UTC m=+944.899863598" watchObservedRunningTime="2025-10-01 12:52:46.58260099 +0000 UTC m=+944.903955827" Oct 01 12:52:47 crc kubenswrapper[4727]: I1001 12:52:47.165925 4727 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.050419 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ce84-account-create-7bvhl"] Oct 01 12:52:48 crc kubenswrapper[4727]: E1001 12:52:48.051068 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e1c176-87f7-401f-9137-00d70f843212" containerName="mariadb-database-create" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.051082 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e1c176-87f7-401f-9137-00d70f843212" containerName="mariadb-database-create" Oct 01 12:52:48 crc kubenswrapper[4727]: E1001 12:52:48.051107 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b59d75-3cf2-451d-b303-07dac30964e5" containerName="mariadb-database-create" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.051114 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b59d75-3cf2-451d-b303-07dac30964e5" containerName="mariadb-database-create" Oct 01 12:52:48 crc kubenswrapper[4727]: E1001 12:52:48.051123 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9" containerName="mariadb-database-create" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.051129 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9" containerName="mariadb-database-create" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.051274 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b59d75-3cf2-451d-b303-07dac30964e5" containerName="mariadb-database-create" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.051286 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e1c176-87f7-401f-9137-00d70f843212" containerName="mariadb-database-create" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.051307 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9" containerName="mariadb-database-create" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.051776 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ce84-account-create-7bvhl" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.053978 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.063590 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ce84-account-create-7bvhl"] Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.187962 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmc4n\" (UniqueName: \"kubernetes.io/projected/9a571990-0e3c-4dc3-805f-5620123cca26-kube-api-access-pmc4n\") pod \"barbican-ce84-account-create-7bvhl\" (UID: \"9a571990-0e3c-4dc3-805f-5620123cca26\") " pod="openstack/barbican-ce84-account-create-7bvhl" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.259755 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-22e1-account-create-rxdnh"] Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.261276 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-22e1-account-create-rxdnh" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.263750 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.268342 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-22e1-account-create-rxdnh"] Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.289562 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmc4n\" (UniqueName: \"kubernetes.io/projected/9a571990-0e3c-4dc3-805f-5620123cca26-kube-api-access-pmc4n\") pod \"barbican-ce84-account-create-7bvhl\" (UID: \"9a571990-0e3c-4dc3-805f-5620123cca26\") " pod="openstack/barbican-ce84-account-create-7bvhl" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.324136 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmc4n\" (UniqueName: \"kubernetes.io/projected/9a571990-0e3c-4dc3-805f-5620123cca26-kube-api-access-pmc4n\") pod \"barbican-ce84-account-create-7bvhl\" (UID: \"9a571990-0e3c-4dc3-805f-5620123cca26\") " pod="openstack/barbican-ce84-account-create-7bvhl" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.378641 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ce84-account-create-7bvhl" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.392089 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7vl5\" (UniqueName: \"kubernetes.io/projected/d7255c8b-01a2-4ab2-82cf-1480602a1083-kube-api-access-g7vl5\") pod \"neutron-22e1-account-create-rxdnh\" (UID: \"d7255c8b-01a2-4ab2-82cf-1480602a1083\") " pod="openstack/neutron-22e1-account-create-rxdnh" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.493918 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7vl5\" (UniqueName: \"kubernetes.io/projected/d7255c8b-01a2-4ab2-82cf-1480602a1083-kube-api-access-g7vl5\") pod \"neutron-22e1-account-create-rxdnh\" (UID: \"d7255c8b-01a2-4ab2-82cf-1480602a1083\") " pod="openstack/neutron-22e1-account-create-rxdnh" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.513837 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7vl5\" (UniqueName: \"kubernetes.io/projected/d7255c8b-01a2-4ab2-82cf-1480602a1083-kube-api-access-g7vl5\") pod \"neutron-22e1-account-create-rxdnh\" (UID: \"d7255c8b-01a2-4ab2-82cf-1480602a1083\") " pod="openstack/neutron-22e1-account-create-rxdnh" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.580046 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-22e1-account-create-rxdnh" Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.582534 4727 generic.go:334] "Generic (PLEG): container finished" podID="615b1b59-cd92-4d09-bce0-5c3ee394a7b3" containerID="bf23e0eabc43f46342e3bb785ad57dc13f4253b398318a5778337cf75e0573a1" exitCode=0 Oct 01 12:52:48 crc kubenswrapper[4727]: I1001 12:52:48.582616 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h9swf" event={"ID":"615b1b59-cd92-4d09-bce0-5c3ee394a7b3","Type":"ContainerDied","Data":"bf23e0eabc43f46342e3bb785ad57dc13f4253b398318a5778337cf75e0573a1"} Oct 01 12:52:49 crc kubenswrapper[4727]: I1001 12:52:48.842112 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ce84-account-create-7bvhl"] Oct 01 12:52:49 crc kubenswrapper[4727]: I1001 12:52:49.597332 4727 generic.go:334] "Generic (PLEG): container finished" podID="9a571990-0e3c-4dc3-805f-5620123cca26" containerID="784f7336b16d43eac26b41ecc990efe969652be25251ea9d9cb7fba26323dcb8" exitCode=0 Oct 01 12:52:49 crc kubenswrapper[4727]: I1001 12:52:49.597489 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ce84-account-create-7bvhl" event={"ID":"9a571990-0e3c-4dc3-805f-5620123cca26","Type":"ContainerDied","Data":"784f7336b16d43eac26b41ecc990efe969652be25251ea9d9cb7fba26323dcb8"} Oct 01 12:52:49 crc kubenswrapper[4727]: I1001 12:52:49.597929 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ce84-account-create-7bvhl" event={"ID":"9a571990-0e3c-4dc3-805f-5620123cca26","Type":"ContainerStarted","Data":"39d2c4674ebe6c4f3aadc0e35edd3e9835fa3794a92eca51dcd952d1e73a654a"} Oct 01 12:52:49 crc kubenswrapper[4727]: I1001 12:52:49.750338 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-22e1-account-create-rxdnh"] Oct 01 12:52:52 crc kubenswrapper[4727]: I1001 12:52:52.167538 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:52:52 crc kubenswrapper[4727]: I1001 12:52:52.221737 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j2xww"] Oct 01 12:52:52 crc kubenswrapper[4727]: I1001 12:52:52.222701 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-j2xww" podUID="fa41b431-40d7-43db-a5a0-d05552cda2d1" containerName="dnsmasq-dns" containerID="cri-o://efe72741183f4e53e906a8d21a4524dd4a11d50d21ac582f95be94e343ad82bc" gracePeriod=10 Oct 01 12:52:52 crc kubenswrapper[4727]: I1001 12:52:52.625654 4727 generic.go:334] "Generic (PLEG): container finished" podID="fa41b431-40d7-43db-a5a0-d05552cda2d1" containerID="efe72741183f4e53e906a8d21a4524dd4a11d50d21ac582f95be94e343ad82bc" exitCode=0 Oct 01 12:52:52 crc kubenswrapper[4727]: I1001 12:52:52.625694 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j2xww" event={"ID":"fa41b431-40d7-43db-a5a0-d05552cda2d1","Type":"ContainerDied","Data":"efe72741183f4e53e906a8d21a4524dd4a11d50d21ac582f95be94e343ad82bc"} Oct 01 12:52:53 crc kubenswrapper[4727]: I1001 12:52:53.087781 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-j2xww" podUID="fa41b431-40d7-43db-a5a0-d05552cda2d1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Oct 01 12:52:55 crc kubenswrapper[4727]: W1001 12:52:55.576599 4727 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7255c8b_01a2_4ab2_82cf_1480602a1083.slice/crio-839d22cc3b2d863a8ee08f0f1bec94fc1a7c8a9184ce8e609aa0dbfb1e848247 WatchSource:0}: Error finding container 839d22cc3b2d863a8ee08f0f1bec94fc1a7c8a9184ce8e609aa0dbfb1e848247: Status 404 returned error can't find the container with id 839d22cc3b2d863a8ee08f0f1bec94fc1a7c8a9184ce8e609aa0dbfb1e848247 Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.664776 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-22e1-account-create-rxdnh" event={"ID":"d7255c8b-01a2-4ab2-82cf-1480602a1083","Type":"ContainerStarted","Data":"839d22cc3b2d863a8ee08f0f1bec94fc1a7c8a9184ce8e609aa0dbfb1e848247"} Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.667801 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-h9swf" event={"ID":"615b1b59-cd92-4d09-bce0-5c3ee394a7b3","Type":"ContainerDied","Data":"641b759b3316997afe8f31bd6b35fa60bef7479c82af44ed3ccc376d7d8b0ca2"} Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.667846 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="641b759b3316997afe8f31bd6b35fa60bef7479c82af44ed3ccc376d7d8b0ca2" Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.669301 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ce84-account-create-7bvhl" event={"ID":"9a571990-0e3c-4dc3-805f-5620123cca26","Type":"ContainerDied","Data":"39d2c4674ebe6c4f3aadc0e35edd3e9835fa3794a92eca51dcd952d1e73a654a"} Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.669350 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39d2c4674ebe6c4f3aadc0e35edd3e9835fa3794a92eca51dcd952d1e73a654a" Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.681408 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.711803 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-combined-ca-bundle\") pod \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\" (UID: \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\") " Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.712018 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-config-data\") pod \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\" (UID: \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\") " Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.720710 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ce84-account-create-7bvhl" Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.744380 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "615b1b59-cd92-4d09-bce0-5c3ee394a7b3" (UID: "615b1b59-cd92-4d09-bce0-5c3ee394a7b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.768622 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-config-data" (OuterVolumeSpecName: "config-data") pod "615b1b59-cd92-4d09-bce0-5c3ee394a7b3" (UID: "615b1b59-cd92-4d09-bce0-5c3ee394a7b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.820507 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnplv\" (UniqueName: \"kubernetes.io/projected/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-kube-api-access-jnplv\") pod \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\" (UID: \"615b1b59-cd92-4d09-bce0-5c3ee394a7b3\") " Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.820619 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmc4n\" (UniqueName: \"kubernetes.io/projected/9a571990-0e3c-4dc3-805f-5620123cca26-kube-api-access-pmc4n\") pod \"9a571990-0e3c-4dc3-805f-5620123cca26\" (UID: \"9a571990-0e3c-4dc3-805f-5620123cca26\") " Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.821025 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.821037 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.824815 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a571990-0e3c-4dc3-805f-5620123cca26-kube-api-access-pmc4n" (OuterVolumeSpecName: "kube-api-access-pmc4n") pod "9a571990-0e3c-4dc3-805f-5620123cca26" (UID: "9a571990-0e3c-4dc3-805f-5620123cca26"). InnerVolumeSpecName "kube-api-access-pmc4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.824869 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-kube-api-access-jnplv" (OuterVolumeSpecName: "kube-api-access-jnplv") pod "615b1b59-cd92-4d09-bce0-5c3ee394a7b3" (UID: "615b1b59-cd92-4d09-bce0-5c3ee394a7b3"). InnerVolumeSpecName "kube-api-access-jnplv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.892208 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.922423 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnplv\" (UniqueName: \"kubernetes.io/projected/615b1b59-cd92-4d09-bce0-5c3ee394a7b3-kube-api-access-jnplv\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:55 crc kubenswrapper[4727]: I1001 12:52:55.922476 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmc4n\" (UniqueName: \"kubernetes.io/projected/9a571990-0e3c-4dc3-805f-5620123cca26-kube-api-access-pmc4n\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.023232 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-config\") pod \"fa41b431-40d7-43db-a5a0-d05552cda2d1\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.023296 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cskvh\" (UniqueName: \"kubernetes.io/projected/fa41b431-40d7-43db-a5a0-d05552cda2d1-kube-api-access-cskvh\") pod \"fa41b431-40d7-43db-a5a0-d05552cda2d1\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.023365 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-ovsdbserver-sb\") pod \"fa41b431-40d7-43db-a5a0-d05552cda2d1\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.023392 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-dns-svc\") pod \"fa41b431-40d7-43db-a5a0-d05552cda2d1\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.023497 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-ovsdbserver-nb\") pod \"fa41b431-40d7-43db-a5a0-d05552cda2d1\" (UID: \"fa41b431-40d7-43db-a5a0-d05552cda2d1\") " Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.027185 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa41b431-40d7-43db-a5a0-d05552cda2d1-kube-api-access-cskvh" (OuterVolumeSpecName: "kube-api-access-cskvh") pod "fa41b431-40d7-43db-a5a0-d05552cda2d1" (UID: "fa41b431-40d7-43db-a5a0-d05552cda2d1"). InnerVolumeSpecName "kube-api-access-cskvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.068319 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-config" (OuterVolumeSpecName: "config") pod "fa41b431-40d7-43db-a5a0-d05552cda2d1" (UID: "fa41b431-40d7-43db-a5a0-d05552cda2d1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.070355 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa41b431-40d7-43db-a5a0-d05552cda2d1" (UID: "fa41b431-40d7-43db-a5a0-d05552cda2d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.070712 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fa41b431-40d7-43db-a5a0-d05552cda2d1" (UID: "fa41b431-40d7-43db-a5a0-d05552cda2d1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.072588 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fa41b431-40d7-43db-a5a0-d05552cda2d1" (UID: "fa41b431-40d7-43db-a5a0-d05552cda2d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.125726 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.125770 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cskvh\" (UniqueName: \"kubernetes.io/projected/fa41b431-40d7-43db-a5a0-d05552cda2d1-kube-api-access-cskvh\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.125782 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.125790 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.125798 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa41b431-40d7-43db-a5a0-d05552cda2d1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.693539 4727 generic.go:334] "Generic (PLEG): container finished" podID="d7255c8b-01a2-4ab2-82cf-1480602a1083" containerID="016eac71034346e62c68c783917ba4a1143cc1edc129602bba128f742185407c" exitCode=0 Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.693893 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-22e1-account-create-rxdnh" event={"ID":"d7255c8b-01a2-4ab2-82cf-1480602a1083","Type":"ContainerDied","Data":"016eac71034346e62c68c783917ba4a1143cc1edc129602bba128f742185407c"} Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.701758 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w52kz" 
event={"ID":"5cf7db27-ff87-481e-a776-cb171e57f4b9","Type":"ContainerStarted","Data":"d01dbe5e7d3fa66afa6e2d63083986a5682b9751e62e8d9115fb1e5fe0317a27"} Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.711675 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-h9swf" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.712307 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-j2xww" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.712628 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j2xww" event={"ID":"fa41b431-40d7-43db-a5a0-d05552cda2d1","Type":"ContainerDied","Data":"dd42968034e535b441249f4b4484bfed4a9fc0a5ac6772e5220d695d21b45ce2"} Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.712663 4727 scope.go:117] "RemoveContainer" containerID="efe72741183f4e53e906a8d21a4524dd4a11d50d21ac582f95be94e343ad82bc" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.712764 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ce84-account-create-7bvhl" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.742215 4727 scope.go:117] "RemoveContainer" containerID="0a063a6ae0e10474a4db8dabe6601c2ec7bac7eb5e98703dc6e80593d3d9ffe5" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.740146 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-w52kz" podStartSLOduration=2.441648679 podStartE2EDuration="15.740122561s" podCreationTimestamp="2025-10-01 12:52:41 +0000 UTC" firstStartedPulling="2025-10-01 12:52:42.423621159 +0000 UTC m=+940.744975986" lastFinishedPulling="2025-10-01 12:52:55.722095031 +0000 UTC m=+954.043449868" observedRunningTime="2025-10-01 12:52:56.723524218 +0000 UTC m=+955.044879075" watchObservedRunningTime="2025-10-01 12:52:56.740122561 +0000 UTC m=+955.061477398" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.761351 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j2xww"] Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.769672 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j2xww"] Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.930367 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-tcpzp"] Oct 01 12:52:56 crc kubenswrapper[4727]: E1001 12:52:56.930762 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a571990-0e3c-4dc3-805f-5620123cca26" containerName="mariadb-account-create" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.930778 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a571990-0e3c-4dc3-805f-5620123cca26" containerName="mariadb-account-create" Oct 01 12:52:56 crc kubenswrapper[4727]: E1001 12:52:56.930794 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615b1b59-cd92-4d09-bce0-5c3ee394a7b3" containerName="keystone-db-sync" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.930801 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="615b1b59-cd92-4d09-bce0-5c3ee394a7b3" containerName="keystone-db-sync" Oct 01 12:52:56 crc kubenswrapper[4727]: E1001 12:52:56.930815 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa41b431-40d7-43db-a5a0-d05552cda2d1" containerName="init" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 
12:52:56.930822 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa41b431-40d7-43db-a5a0-d05552cda2d1" containerName="init" Oct 01 12:52:56 crc kubenswrapper[4727]: E1001 12:52:56.930843 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa41b431-40d7-43db-a5a0-d05552cda2d1" containerName="dnsmasq-dns" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.930850 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa41b431-40d7-43db-a5a0-d05552cda2d1" containerName="dnsmasq-dns" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.931062 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa41b431-40d7-43db-a5a0-d05552cda2d1" containerName="dnsmasq-dns" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.931081 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="615b1b59-cd92-4d09-bce0-5c3ee394a7b3" containerName="keystone-db-sync" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.931109 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a571990-0e3c-4dc3-805f-5620123cca26" containerName="mariadb-account-create" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.932556 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:56 crc kubenswrapper[4727]: I1001 12:52:56.958799 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-tcpzp"] Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.004309 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-s7wzj"] Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.008138 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.012472 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.012736 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jxtdc" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.012926 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.013089 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.028393 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7wzj"] Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.046020 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrxw9\" (UniqueName: \"kubernetes.io/projected/9640282e-9dc9-4240-ad29-599e5498278d-kube-api-access-rrxw9\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.046075 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-dns-svc\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.046104 4727 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.046195 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.046238 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.046257 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-config\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.147472 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.147515 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-config\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.147553 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrxw9\" (UniqueName: \"kubernetes.io/projected/9640282e-9dc9-4240-ad29-599e5498278d-kube-api-access-rrxw9\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.147584 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-dns-svc\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.147737 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-credential-keys\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.147816 4727 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.147881 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-config-data\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.147915 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-scripts\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.148035 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mm5z\" (UniqueName: \"kubernetes.io/projected/9bd17d59-909c-4109-9063-f7f21106137b-kube-api-access-6mm5z\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.148155 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.148198 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-combined-ca-bundle\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.148228 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-fernet-keys\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.148571 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-dns-svc\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.148663 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.148693 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.149116 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.149399 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-config\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.154257 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.156654 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.159719 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.159906 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.178831 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.186134 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrxw9\" (UniqueName: \"kubernetes.io/projected/9640282e-9dc9-4240-ad29-599e5498278d-kube-api-access-rrxw9\") pod \"dnsmasq-dns-55fff446b9-tcpzp\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.249474 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.249923 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-combined-ca-bundle\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.249966 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-fernet-keys\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.249990 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.250039 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25c1560a-bd40-490a-8d86-a71b9a34b7ea-run-httpd\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.250090 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.250119 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-credential-keys\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.250160 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25c1560a-bd40-490a-8d86-a71b9a34b7ea-log-httpd\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.250187 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-config-data\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.250209 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-scripts\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.250264 4727 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6mm5z\" (UniqueName: \"kubernetes.io/projected/9bd17d59-909c-4109-9063-f7f21106137b-kube-api-access-6mm5z\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.250314 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-scripts\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.250338 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-config-data\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.250361 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxhcc\" (UniqueName: \"kubernetes.io/projected/25c1560a-bd40-490a-8d86-a71b9a34b7ea-kube-api-access-kxhcc\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.254727 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-combined-ca-bundle\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.255336 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-credential-keys\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.257720 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-scripts\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.265766 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-config-data\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.277868 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mm5z\" (UniqueName: \"kubernetes.io/projected/9bd17d59-909c-4109-9063-f7f21106137b-kube-api-access-6mm5z\") pod \"keystone-bootstrap-s7wzj\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.278372 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-fernet-keys\") pod \"keystone-bootstrap-s7wzj\" (UID: 
\"9bd17d59-909c-4109-9063-f7f21106137b\") " pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.339758 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.352427 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.352494 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25c1560a-bd40-490a-8d86-a71b9a34b7ea-log-httpd\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.352591 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-scripts\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.352644 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-config-data\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.352669 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxhcc\" (UniqueName: \"kubernetes.io/projected/25c1560a-bd40-490a-8d86-a71b9a34b7ea-kube-api-access-kxhcc\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.352708 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.352751 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25c1560a-bd40-490a-8d86-a71b9a34b7ea-run-httpd\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.353257 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25c1560a-bd40-490a-8d86-a71b9a34b7ea-run-httpd\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.353919 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25c1560a-bd40-490a-8d86-a71b9a34b7ea-log-httpd\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.358423 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-scripts\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.363975 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-config-data\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.374299 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.374391 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-tcpzp"] Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.375933 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.388313 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxhcc\" (UniqueName: \"kubernetes.io/projected/25c1560a-bd40-490a-8d86-a71b9a34b7ea-kube-api-access-kxhcc\") pod \"ceilometer-0\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.399136 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2qtjv"] Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.400462 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.406845 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kfxjq" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.407109 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.407322 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.417205 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-dbh85"] Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.419114 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.428347 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2qtjv"] Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.450917 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-dbh85"] Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.463746 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb9c560-a9b2-4243-b59a-b40142e48739-logs\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.463850 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.463963 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4hqr\" (UniqueName: \"kubernetes.io/projected/aca1aa1b-b52a-447a-aa0f-771345a441c4-kube-api-access-p4hqr\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.464020 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-combined-ca-bundle\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.464068 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.464096 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.464223 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-config-data\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.464299 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmjtd\" (UniqueName: \"kubernetes.io/projected/4eb9c560-a9b2-4243-b59a-b40142e48739-kube-api-access-xmjtd\") pod \"placement-db-sync-2qtjv\" (UID: 
\"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.464324 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-config\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.464356 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-scripts\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.464379 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.477747 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.577876 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-config-data\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.577934 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmjtd\" (UniqueName: \"kubernetes.io/projected/4eb9c560-a9b2-4243-b59a-b40142e48739-kube-api-access-xmjtd\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.577952 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-config\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.577972 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-scripts\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.578009 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.578031 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb9c560-a9b2-4243-b59a-b40142e48739-logs\") pod 
\"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.578058 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.578102 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4hqr\" (UniqueName: \"kubernetes.io/projected/aca1aa1b-b52a-447a-aa0f-771345a441c4-kube-api-access-p4hqr\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.578123 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-combined-ca-bundle\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.578149 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.578168 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.578970 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.579872 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb9c560-a9b2-4243-b59a-b40142e48739-logs\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.580768 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-config\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.581795 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 
12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.583982 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.584849 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.604962 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-combined-ca-bundle\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.609394 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-scripts\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.609496 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmjtd\" (UniqueName: \"kubernetes.io/projected/4eb9c560-a9b2-4243-b59a-b40142e48739-kube-api-access-xmjtd\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.609614 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4hqr\" (UniqueName: \"kubernetes.io/projected/aca1aa1b-b52a-447a-aa0f-771345a441c4-kube-api-access-p4hqr\") pod \"dnsmasq-dns-76fcf4b695-dbh85\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.610583 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-config-data\") pod \"placement-db-sync-2qtjv\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.751901 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2qtjv" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.805407 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.845740 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-tcpzp"] Oct 01 12:52:57 crc kubenswrapper[4727]: W1001 12:52:57.857827 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9640282e_9dc9_4240_ad29_599e5498278d.slice/crio-e37d5a6da1866e8d9b6ae64b481db09b66fad3a16f353538810e9e270eacb6c2 WatchSource:0}: Error finding container e37d5a6da1866e8d9b6ae64b481db09b66fad3a16f353538810e9e270eacb6c2: Status 404 returned error can't find the container with id e37d5a6da1866e8d9b6ae64b481db09b66fad3a16f353538810e9e270eacb6c2 Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.980696 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9250-account-create-7h2cw"] Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.981766 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9250-account-create-7h2cw" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.984473 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 01 12:52:57 crc kubenswrapper[4727]: I1001 12:52:57.998432 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9250-account-create-7h2cw"] Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.029537 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-s7wzj"] Oct 01 12:52:58 crc kubenswrapper[4727]: W1001 12:52:58.033669 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bd17d59_909c_4109_9063_f7f21106137b.slice/crio-bd3d848ab9bfabe0b83ec47ae85e6d89d2245cb6d37417ebc8210f9f8ddfb81a WatchSource:0}: Error finding container bd3d848ab9bfabe0b83ec47ae85e6d89d2245cb6d37417ebc8210f9f8ddfb81a: Status 404 returned error can't find the container with id bd3d848ab9bfabe0b83ec47ae85e6d89d2245cb6d37417ebc8210f9f8ddfb81a Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.092510 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-889s7\" (UniqueName: \"kubernetes.io/projected/9819e121-08cf-4bc2-ab90-1560b86b3cd5-kube-api-access-889s7\") pod \"cinder-9250-account-create-7h2cw\" (UID: \"9819e121-08cf-4bc2-ab90-1560b86b3cd5\") " pod="openstack/cinder-9250-account-create-7h2cw" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.104147 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.194858 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-889s7\" (UniqueName: \"kubernetes.io/projected/9819e121-08cf-4bc2-ab90-1560b86b3cd5-kube-api-access-889s7\") pod \"cinder-9250-account-create-7h2cw\" (UID: \"9819e121-08cf-4bc2-ab90-1560b86b3cd5\") " pod="openstack/cinder-9250-account-create-7h2cw" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.222628 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-889s7\" (UniqueName: \"kubernetes.io/projected/9819e121-08cf-4bc2-ab90-1560b86b3cd5-kube-api-access-889s7\") pod \"cinder-9250-account-create-7h2cw\" (UID: \"9819e121-08cf-4bc2-ab90-1560b86b3cd5\") " pod="openstack/cinder-9250-account-create-7h2cw" Oct 
01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.224427 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-22e1-account-create-rxdnh" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.298067 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7vl5\" (UniqueName: \"kubernetes.io/projected/d7255c8b-01a2-4ab2-82cf-1480602a1083-kube-api-access-g7vl5\") pod \"d7255c8b-01a2-4ab2-82cf-1480602a1083\" (UID: \"d7255c8b-01a2-4ab2-82cf-1480602a1083\") " Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.321535 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7255c8b-01a2-4ab2-82cf-1480602a1083-kube-api-access-g7vl5" (OuterVolumeSpecName: "kube-api-access-g7vl5") pod "d7255c8b-01a2-4ab2-82cf-1480602a1083" (UID: "d7255c8b-01a2-4ab2-82cf-1480602a1083"). InnerVolumeSpecName "kube-api-access-g7vl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.340337 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7vl5\" (UniqueName: \"kubernetes.io/projected/d7255c8b-01a2-4ab2-82cf-1480602a1083-kube-api-access-g7vl5\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.364085 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9250-account-create-7h2cw" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.385338 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa41b431-40d7-43db-a5a0-d05552cda2d1" path="/var/lib/kubelet/pods/fa41b431-40d7-43db-a5a0-d05552cda2d1/volumes" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.399760 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2qtjv"] Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.407643 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-dbh85"] Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.443583 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-rq4c7"] Oct 01 12:52:58 crc kubenswrapper[4727]: E1001 12:52:58.443994 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7255c8b-01a2-4ab2-82cf-1480602a1083" containerName="mariadb-account-create" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.444024 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7255c8b-01a2-4ab2-82cf-1480602a1083" containerName="mariadb-account-create" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.444211 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7255c8b-01a2-4ab2-82cf-1480602a1083" containerName="mariadb-account-create" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.444888 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.447677 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8l9cb" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.450317 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.451315 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rq4c7"] Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.549821 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4blsk\" (UniqueName: \"kubernetes.io/projected/087bee3f-a34f-43ca-ac4b-b3e46e068898-kube-api-access-4blsk\") pod \"barbican-db-sync-rq4c7\" (UID: \"087bee3f-a34f-43ca-ac4b-b3e46e068898\") " pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.550533 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087bee3f-a34f-43ca-ac4b-b3e46e068898-combined-ca-bundle\") pod \"barbican-db-sync-rq4c7\" (UID: \"087bee3f-a34f-43ca-ac4b-b3e46e068898\") " pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.550682 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/087bee3f-a34f-43ca-ac4b-b3e46e068898-db-sync-config-data\") pod \"barbican-db-sync-rq4c7\" (UID: \"087bee3f-a34f-43ca-ac4b-b3e46e068898\") " pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.652526 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087bee3f-a34f-43ca-ac4b-b3e46e068898-combined-ca-bundle\") pod \"barbican-db-sync-rq4c7\" (UID: \"087bee3f-a34f-43ca-ac4b-b3e46e068898\") " pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.652627 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/087bee3f-a34f-43ca-ac4b-b3e46e068898-db-sync-config-data\") pod \"barbican-db-sync-rq4c7\" (UID: \"087bee3f-a34f-43ca-ac4b-b3e46e068898\") " pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.653579 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4blsk\" (UniqueName: \"kubernetes.io/projected/087bee3f-a34f-43ca-ac4b-b3e46e068898-kube-api-access-4blsk\") pod \"barbican-db-sync-rq4c7\" (UID: \"087bee3f-a34f-43ca-ac4b-b3e46e068898\") " pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.658717 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/087bee3f-a34f-43ca-ac4b-b3e46e068898-db-sync-config-data\") pod \"barbican-db-sync-rq4c7\" (UID: \"087bee3f-a34f-43ca-ac4b-b3e46e068898\") " pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.670560 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/087bee3f-a34f-43ca-ac4b-b3e46e068898-combined-ca-bundle\") pod \"barbican-db-sync-rq4c7\" (UID: \"087bee3f-a34f-43ca-ac4b-b3e46e068898\") " pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.683952 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4blsk\" (UniqueName: \"kubernetes.io/projected/087bee3f-a34f-43ca-ac4b-b3e46e068898-kube-api-access-4blsk\") pod \"barbican-db-sync-rq4c7\" (UID: \"087bee3f-a34f-43ca-ac4b-b3e46e068898\") " pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.744065 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7wzj" event={"ID":"9bd17d59-909c-4109-9063-f7f21106137b","Type":"ContainerStarted","Data":"26a4427b3af7f6c00db2becc7c9b767e17e057e4a77167a0cd5c582227845a3c"} Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.744110 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7wzj" event={"ID":"9bd17d59-909c-4109-9063-f7f21106137b","Type":"ContainerStarted","Data":"bd3d848ab9bfabe0b83ec47ae85e6d89d2245cb6d37417ebc8210f9f8ddfb81a"} Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.746986 4727 generic.go:334] "Generic (PLEG): container finished" podID="9640282e-9dc9-4240-ad29-599e5498278d" containerID="8f65a3e5714fd643721817634c41c4147fea649d1a6b1b7e5a74d6665b0b397e" exitCode=0 Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.747041 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" event={"ID":"9640282e-9dc9-4240-ad29-599e5498278d","Type":"ContainerDied","Data":"8f65a3e5714fd643721817634c41c4147fea649d1a6b1b7e5a74d6665b0b397e"} Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.747059 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" event={"ID":"9640282e-9dc9-4240-ad29-599e5498278d","Type":"ContainerStarted","Data":"e37d5a6da1866e8d9b6ae64b481db09b66fad3a16f353538810e9e270eacb6c2"} Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.748563 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2qtjv" event={"ID":"4eb9c560-a9b2-4243-b59a-b40142e48739","Type":"ContainerStarted","Data":"3285cece2bdf68c89e745038a61cc1ac1734b0d0c25e2789b0df2636d5a66ffe"} Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.754138 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25c1560a-bd40-490a-8d86-a71b9a34b7ea","Type":"ContainerStarted","Data":"eec082c363ea6843e1a95bccc3760e56fb185dee9511a8ea8ead0d6605585cfe"} Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.756853 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-22e1-account-create-rxdnh" event={"ID":"d7255c8b-01a2-4ab2-82cf-1480602a1083","Type":"ContainerDied","Data":"839d22cc3b2d863a8ee08f0f1bec94fc1a7c8a9184ce8e609aa0dbfb1e848247"} Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.756884 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="839d22cc3b2d863a8ee08f0f1bec94fc1a7c8a9184ce8e609aa0dbfb1e848247" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.756966 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-22e1-account-create-rxdnh" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.762538 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" event={"ID":"aca1aa1b-b52a-447a-aa0f-771345a441c4","Type":"ContainerStarted","Data":"0eef0ba817e3e4b9f43b2c2ee52f9911b07421103e5fb1db0adcc46cc283374c"} Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.762568 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" event={"ID":"aca1aa1b-b52a-447a-aa0f-771345a441c4","Type":"ContainerStarted","Data":"b612d444d2145887f0b1a50a062913b523397dc52876ae3ca1b2f5041984105f"} Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.775753 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.800668 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-s7wzj" podStartSLOduration=2.800643494 podStartE2EDuration="2.800643494s" podCreationTimestamp="2025-10-01 12:52:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:52:58.7754405 +0000 UTC m=+957.096795357" watchObservedRunningTime="2025-10-01 12:52:58.800643494 +0000 UTC m=+957.121998341" Oct 01 12:52:58 crc kubenswrapper[4727]: I1001 12:52:58.937054 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9250-account-create-7h2cw"] Oct 01 12:52:58 crc kubenswrapper[4727]: W1001 12:52:58.953041 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9819e121_08cf_4bc2_ab90_1560b86b3cd5.slice/crio-2989d07176ccb7c3e9c341bca947f535a15600753eba59a6a06c8770c84cffe9 WatchSource:0}: Error finding container 2989d07176ccb7c3e9c341bca947f535a15600753eba59a6a06c8770c84cffe9: Status 404 returned error can't find the container with id 2989d07176ccb7c3e9c341bca947f535a15600753eba59a6a06c8770c84cffe9 Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.113355 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.175276 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-config\") pod \"9640282e-9dc9-4240-ad29-599e5498278d\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.175345 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-ovsdbserver-nb\") pod \"9640282e-9dc9-4240-ad29-599e5498278d\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.175406 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-dns-swift-storage-0\") pod \"9640282e-9dc9-4240-ad29-599e5498278d\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.175467 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-ovsdbserver-sb\") pod \"9640282e-9dc9-4240-ad29-599e5498278d\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.175509 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-dns-svc\") pod \"9640282e-9dc9-4240-ad29-599e5498278d\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.175585 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrxw9\" (UniqueName: \"kubernetes.io/projected/9640282e-9dc9-4240-ad29-599e5498278d-kube-api-access-rrxw9\") pod \"9640282e-9dc9-4240-ad29-599e5498278d\" (UID: \"9640282e-9dc9-4240-ad29-599e5498278d\") " Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.182250 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9640282e-9dc9-4240-ad29-599e5498278d-kube-api-access-rrxw9" (OuterVolumeSpecName: "kube-api-access-rrxw9") pod "9640282e-9dc9-4240-ad29-599e5498278d" (UID: "9640282e-9dc9-4240-ad29-599e5498278d"). InnerVolumeSpecName "kube-api-access-rrxw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.220559 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9640282e-9dc9-4240-ad29-599e5498278d" (UID: "9640282e-9dc9-4240-ad29-599e5498278d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.225166 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9640282e-9dc9-4240-ad29-599e5498278d" (UID: "9640282e-9dc9-4240-ad29-599e5498278d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.229939 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-config" (OuterVolumeSpecName: "config") pod "9640282e-9dc9-4240-ad29-599e5498278d" (UID: "9640282e-9dc9-4240-ad29-599e5498278d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.233987 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9640282e-9dc9-4240-ad29-599e5498278d" (UID: "9640282e-9dc9-4240-ad29-599e5498278d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.236841 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9640282e-9dc9-4240-ad29-599e5498278d" (UID: "9640282e-9dc9-4240-ad29-599e5498278d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.272643 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.277165 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.277202 4727 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.277212 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.277222 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.277232 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrxw9\" (UniqueName: \"kubernetes.io/projected/9640282e-9dc9-4240-ad29-599e5498278d-kube-api-access-rrxw9\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.277241 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9640282e-9dc9-4240-ad29-599e5498278d-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.375815 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rq4c7"] Oct 01 12:52:59 crc kubenswrapper[4727]: W1001 12:52:59.389842 4727 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod087bee3f_a34f_43ca_ac4b_b3e46e068898.slice/crio-0eedd0e3bdbed97a0116bc3e467603f85a3ef30bcb6f55078aca79eba54d2a10 WatchSource:0}: Error finding container 0eedd0e3bdbed97a0116bc3e467603f85a3ef30bcb6f55078aca79eba54d2a10: Status 404 returned error can't find the container with id 0eedd0e3bdbed97a0116bc3e467603f85a3ef30bcb6f55078aca79eba54d2a10 Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.772239 4727 generic.go:334] "Generic (PLEG): container finished" podID="9819e121-08cf-4bc2-ab90-1560b86b3cd5" containerID="503e5b2d580b683cfb5a9fefc83eb21a7b99d8005b38d58261ba09724dce2424" exitCode=0 Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.772323 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9250-account-create-7h2cw" event={"ID":"9819e121-08cf-4bc2-ab90-1560b86b3cd5","Type":"ContainerDied","Data":"503e5b2d580b683cfb5a9fefc83eb21a7b99d8005b38d58261ba09724dce2424"} Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.772355 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9250-account-create-7h2cw" event={"ID":"9819e121-08cf-4bc2-ab90-1560b86b3cd5","Type":"ContainerStarted","Data":"2989d07176ccb7c3e9c341bca947f535a15600753eba59a6a06c8770c84cffe9"} Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.774055 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rq4c7" event={"ID":"087bee3f-a34f-43ca-ac4b-b3e46e068898","Type":"ContainerStarted","Data":"0eedd0e3bdbed97a0116bc3e467603f85a3ef30bcb6f55078aca79eba54d2a10"} Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.775483 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" event={"ID":"9640282e-9dc9-4240-ad29-599e5498278d","Type":"ContainerDied","Data":"e37d5a6da1866e8d9b6ae64b481db09b66fad3a16f353538810e9e270eacb6c2"} Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.775522 4727 scope.go:117] "RemoveContainer" containerID="8f65a3e5714fd643721817634c41c4147fea649d1a6b1b7e5a74d6665b0b397e" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.775636 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-tcpzp" Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.786871 4727 generic.go:334] "Generic (PLEG): container finished" podID="aca1aa1b-b52a-447a-aa0f-771345a441c4" containerID="0eef0ba817e3e4b9f43b2c2ee52f9911b07421103e5fb1db0adcc46cc283374c" exitCode=0 Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.787960 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" event={"ID":"aca1aa1b-b52a-447a-aa0f-771345a441c4","Type":"ContainerDied","Data":"0eef0ba817e3e4b9f43b2c2ee52f9911b07421103e5fb1db0adcc46cc283374c"} Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.889120 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-tcpzp"] Oct 01 12:52:59 crc kubenswrapper[4727]: I1001 12:52:59.897203 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-tcpzp"] Oct 01 12:53:00 crc kubenswrapper[4727]: I1001 12:53:00.382767 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9640282e-9dc9-4240-ad29-599e5498278d" path="/var/lib/kubelet/pods/9640282e-9dc9-4240-ad29-599e5498278d/volumes" Oct 01 12:53:00 crc kubenswrapper[4727]: I1001 12:53:00.799553 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" event={"ID":"aca1aa1b-b52a-447a-aa0f-771345a441c4","Type":"ContainerStarted","Data":"af08112beb9a78d13612f378533eb4aa62d825792604cb0de7900927201b3d10"} Oct 01 12:53:00 crc kubenswrapper[4727]: I1001 12:53:00.799636 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:53:00 crc kubenswrapper[4727]: I1001 12:53:00.825822 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" podStartSLOduration=3.825801233 podStartE2EDuration="3.825801233s" podCreationTimestamp="2025-10-01 12:52:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:00.819137493 +0000 UTC m=+959.140492340" watchObservedRunningTime="2025-10-01 12:53:00.825801233 +0000 UTC m=+959.147156080" Oct 01 12:53:02 crc kubenswrapper[4727]: I1001 12:53:02.817864 4727 generic.go:334] "Generic (PLEG): container finished" podID="9bd17d59-909c-4109-9063-f7f21106137b" containerID="26a4427b3af7f6c00db2becc7c9b767e17e057e4a77167a0cd5c582227845a3c" exitCode=0 Oct 01 12:53:02 crc kubenswrapper[4727]: I1001 12:53:02.817928 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7wzj" event={"ID":"9bd17d59-909c-4109-9063-f7f21106137b","Type":"ContainerDied","Data":"26a4427b3af7f6c00db2becc7c9b767e17e057e4a77167a0cd5c582227845a3c"} Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.034590 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9250-account-create-7h2cw" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.151812 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-889s7\" (UniqueName: \"kubernetes.io/projected/9819e121-08cf-4bc2-ab90-1560b86b3cd5-kube-api-access-889s7\") pod \"9819e121-08cf-4bc2-ab90-1560b86b3cd5\" (UID: \"9819e121-08cf-4bc2-ab90-1560b86b3cd5\") " Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.158650 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9819e121-08cf-4bc2-ab90-1560b86b3cd5-kube-api-access-889s7" (OuterVolumeSpecName: "kube-api-access-889s7") pod "9819e121-08cf-4bc2-ab90-1560b86b3cd5" (UID: "9819e121-08cf-4bc2-ab90-1560b86b3cd5"). InnerVolumeSpecName "kube-api-access-889s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.253990 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-889s7\" (UniqueName: \"kubernetes.io/projected/9819e121-08cf-4bc2-ab90-1560b86b3cd5-kube-api-access-889s7\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.292466 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.292530 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.292580 4727 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.293334 4727 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ee5ee2e5696638af5bc213bd13dc53b7b85703a971ba03bb8cf933270c1945e"} pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.293397 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" containerID="cri-o://8ee5ee2e5696638af5bc213bd13dc53b7b85703a971ba03bb8cf933270c1945e" gracePeriod=600 Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.671836 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-nkrx8"] Oct 01 12:53:03 crc kubenswrapper[4727]: E1001 12:53:03.672181 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9819e121-08cf-4bc2-ab90-1560b86b3cd5" containerName="mariadb-account-create" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.672196 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="9819e121-08cf-4bc2-ab90-1560b86b3cd5" containerName="mariadb-account-create" Oct 01 12:53:03 crc 
kubenswrapper[4727]: E1001 12:53:03.672209 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9640282e-9dc9-4240-ad29-599e5498278d" containerName="init" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.672214 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="9640282e-9dc9-4240-ad29-599e5498278d" containerName="init" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.672402 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="9640282e-9dc9-4240-ad29-599e5498278d" containerName="init" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.672419 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="9819e121-08cf-4bc2-ab90-1560b86b3cd5" containerName="mariadb-account-create" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.672905 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.674917 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.675064 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zjn49" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.678948 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.692902 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nkrx8"] Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.769383 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4da191c-6509-4bb7-b9b2-344f8224ae58-config\") pod \"neutron-db-sync-nkrx8\" (UID: \"d4da191c-6509-4bb7-b9b2-344f8224ae58\") " pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.769741 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltttk\" (UniqueName: \"kubernetes.io/projected/d4da191c-6509-4bb7-b9b2-344f8224ae58-kube-api-access-ltttk\") pod \"neutron-db-sync-nkrx8\" (UID: \"d4da191c-6509-4bb7-b9b2-344f8224ae58\") " pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.769864 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4da191c-6509-4bb7-b9b2-344f8224ae58-combined-ca-bundle\") pod \"neutron-db-sync-nkrx8\" (UID: \"d4da191c-6509-4bb7-b9b2-344f8224ae58\") " pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.830874 4727 generic.go:334] "Generic (PLEG): container finished" podID="d18290ae-64a5-44a5-a704-90977d85852b" containerID="8ee5ee2e5696638af5bc213bd13dc53b7b85703a971ba03bb8cf933270c1945e" exitCode=0 Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.830955 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerDied","Data":"8ee5ee2e5696638af5bc213bd13dc53b7b85703a971ba03bb8cf933270c1945e"} Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.831074 4727 scope.go:117] "RemoveContainer" containerID="5a5c4ca99360c9b81c10e0ced10d126f629e2db295e44de92257033d1fe6295f" Oct 01 
12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.834064 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9250-account-create-7h2cw" event={"ID":"9819e121-08cf-4bc2-ab90-1560b86b3cd5","Type":"ContainerDied","Data":"2989d07176ccb7c3e9c341bca947f535a15600753eba59a6a06c8770c84cffe9"} Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.834102 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2989d07176ccb7c3e9c341bca947f535a15600753eba59a6a06c8770c84cffe9" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.834130 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9250-account-create-7h2cw" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.872783 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltttk\" (UniqueName: \"kubernetes.io/projected/d4da191c-6509-4bb7-b9b2-344f8224ae58-kube-api-access-ltttk\") pod \"neutron-db-sync-nkrx8\" (UID: \"d4da191c-6509-4bb7-b9b2-344f8224ae58\") " pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.872899 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4da191c-6509-4bb7-b9b2-344f8224ae58-combined-ca-bundle\") pod \"neutron-db-sync-nkrx8\" (UID: \"d4da191c-6509-4bb7-b9b2-344f8224ae58\") " pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.872943 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4da191c-6509-4bb7-b9b2-344f8224ae58-config\") pod \"neutron-db-sync-nkrx8\" (UID: \"d4da191c-6509-4bb7-b9b2-344f8224ae58\") " pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.878163 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4da191c-6509-4bb7-b9b2-344f8224ae58-config\") pod \"neutron-db-sync-nkrx8\" (UID: \"d4da191c-6509-4bb7-b9b2-344f8224ae58\") " pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.878373 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4da191c-6509-4bb7-b9b2-344f8224ae58-combined-ca-bundle\") pod \"neutron-db-sync-nkrx8\" (UID: \"d4da191c-6509-4bb7-b9b2-344f8224ae58\") " pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.892760 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltttk\" (UniqueName: \"kubernetes.io/projected/d4da191c-6509-4bb7-b9b2-344f8224ae58-kube-api-access-ltttk\") pod \"neutron-db-sync-nkrx8\" (UID: \"d4da191c-6509-4bb7-b9b2-344f8224ae58\") " pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:03 crc kubenswrapper[4727]: I1001 12:53:03.998195 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:04 crc kubenswrapper[4727]: I1001 12:53:04.844144 4727 generic.go:334] "Generic (PLEG): container finished" podID="5cf7db27-ff87-481e-a776-cb171e57f4b9" containerID="d01dbe5e7d3fa66afa6e2d63083986a5682b9751e62e8d9115fb1e5fe0317a27" exitCode=0 Oct 01 12:53:04 crc kubenswrapper[4727]: I1001 12:53:04.844211 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w52kz" event={"ID":"5cf7db27-ff87-481e-a776-cb171e57f4b9","Type":"ContainerDied","Data":"d01dbe5e7d3fa66afa6e2d63083986a5682b9751e62e8d9115fb1e5fe0317a27"} Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.278501 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.397262 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-config-data\") pod \"9bd17d59-909c-4109-9063-f7f21106137b\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.397770 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mm5z\" (UniqueName: \"kubernetes.io/projected/9bd17d59-909c-4109-9063-f7f21106137b-kube-api-access-6mm5z\") pod \"9bd17d59-909c-4109-9063-f7f21106137b\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.397819 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-credential-keys\") pod \"9bd17d59-909c-4109-9063-f7f21106137b\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.397854 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-scripts\") pod \"9bd17d59-909c-4109-9063-f7f21106137b\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.397930 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-combined-ca-bundle\") pod \"9bd17d59-909c-4109-9063-f7f21106137b\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.397972 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-fernet-keys\") pod \"9bd17d59-909c-4109-9063-f7f21106137b\" (UID: \"9bd17d59-909c-4109-9063-f7f21106137b\") " Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.403407 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9bd17d59-909c-4109-9063-f7f21106137b" (UID: "9bd17d59-909c-4109-9063-f7f21106137b"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.403778 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-scripts" (OuterVolumeSpecName: "scripts") pod "9bd17d59-909c-4109-9063-f7f21106137b" (UID: "9bd17d59-909c-4109-9063-f7f21106137b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.404374 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9bd17d59-909c-4109-9063-f7f21106137b" (UID: "9bd17d59-909c-4109-9063-f7f21106137b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.405854 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bd17d59-909c-4109-9063-f7f21106137b-kube-api-access-6mm5z" (OuterVolumeSpecName: "kube-api-access-6mm5z") pod "9bd17d59-909c-4109-9063-f7f21106137b" (UID: "9bd17d59-909c-4109-9063-f7f21106137b"). InnerVolumeSpecName "kube-api-access-6mm5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.425340 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-config-data" (OuterVolumeSpecName: "config-data") pod "9bd17d59-909c-4109-9063-f7f21106137b" (UID: "9bd17d59-909c-4109-9063-f7f21106137b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.426958 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bd17d59-909c-4109-9063-f7f21106137b" (UID: "9bd17d59-909c-4109-9063-f7f21106137b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.499690 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mm5z\" (UniqueName: \"kubernetes.io/projected/9bd17d59-909c-4109-9063-f7f21106137b-kube-api-access-6mm5z\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.499729 4727 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.499739 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.499749 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.499759 4727 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.499768 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bd17d59-909c-4109-9063-f7f21106137b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.854053 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-s7wzj" Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.857287 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-s7wzj" event={"ID":"9bd17d59-909c-4109-9063-f7f21106137b","Type":"ContainerDied","Data":"bd3d848ab9bfabe0b83ec47ae85e6d89d2245cb6d37417ebc8210f9f8ddfb81a"} Oct 01 12:53:05 crc kubenswrapper[4727]: I1001 12:53:05.857559 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd3d848ab9bfabe0b83ec47ae85e6d89d2245cb6d37417ebc8210f9f8ddfb81a" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.357802 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-s7wzj"] Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.366762 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-s7wzj"] Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.382291 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bd17d59-909c-4109-9063-f7f21106137b" path="/var/lib/kubelet/pods/9bd17d59-909c-4109-9063-f7f21106137b/volumes" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.457487 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-czk9d"] Oct 01 12:53:06 crc kubenswrapper[4727]: E1001 12:53:06.457828 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd17d59-909c-4109-9063-f7f21106137b" containerName="keystone-bootstrap" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.457846 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd17d59-909c-4109-9063-f7f21106137b" containerName="keystone-bootstrap" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.458044 4727 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd17d59-909c-4109-9063-f7f21106137b" containerName="keystone-bootstrap" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.458695 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.461308 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.461525 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.461663 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jxtdc" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.464089 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.468329 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-czk9d"] Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.521181 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8plb\" (UniqueName: \"kubernetes.io/projected/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-kube-api-access-x8plb\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.521293 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-credential-keys\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.521368 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-config-data\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.521423 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-combined-ca-bundle\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.521463 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-fernet-keys\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.521491 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-scripts\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc 
kubenswrapper[4727]: I1001 12:53:06.623930 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8plb\" (UniqueName: \"kubernetes.io/projected/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-kube-api-access-x8plb\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.624035 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-credential-keys\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.624069 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-config-data\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.624093 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-combined-ca-bundle\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.624124 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-fernet-keys\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.624148 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-scripts\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.629673 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-scripts\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.630296 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-fernet-keys\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.630653 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-credential-keys\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.631264 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-config-data\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.638870 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-combined-ca-bundle\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.639221 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8plb\" (UniqueName: \"kubernetes.io/projected/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-kube-api-access-x8plb\") pod \"keystone-bootstrap-czk9d\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:06 crc kubenswrapper[4727]: I1001 12:53:06.784611 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.228452 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w52kz" Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.333950 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-db-sync-config-data\") pod \"5cf7db27-ff87-481e-a776-cb171e57f4b9\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.334081 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-config-data\") pod \"5cf7db27-ff87-481e-a776-cb171e57f4b9\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.334143 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-combined-ca-bundle\") pod \"5cf7db27-ff87-481e-a776-cb171e57f4b9\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.334168 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55gfl\" (UniqueName: \"kubernetes.io/projected/5cf7db27-ff87-481e-a776-cb171e57f4b9-kube-api-access-55gfl\") pod \"5cf7db27-ff87-481e-a776-cb171e57f4b9\" (UID: \"5cf7db27-ff87-481e-a776-cb171e57f4b9\") " Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.338344 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5cf7db27-ff87-481e-a776-cb171e57f4b9" (UID: "5cf7db27-ff87-481e-a776-cb171e57f4b9"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.338949 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cf7db27-ff87-481e-a776-cb171e57f4b9-kube-api-access-55gfl" (OuterVolumeSpecName: "kube-api-access-55gfl") pod "5cf7db27-ff87-481e-a776-cb171e57f4b9" (UID: "5cf7db27-ff87-481e-a776-cb171e57f4b9"). InnerVolumeSpecName "kube-api-access-55gfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.369021 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cf7db27-ff87-481e-a776-cb171e57f4b9" (UID: "5cf7db27-ff87-481e-a776-cb171e57f4b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.383422 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-config-data" (OuterVolumeSpecName: "config-data") pod "5cf7db27-ff87-481e-a776-cb171e57f4b9" (UID: "5cf7db27-ff87-481e-a776-cb171e57f4b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.436273 4727 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.436320 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.436337 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cf7db27-ff87-481e-a776-cb171e57f4b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.436346 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55gfl\" (UniqueName: \"kubernetes.io/projected/5cf7db27-ff87-481e-a776-cb171e57f4b9-kube-api-access-55gfl\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.807173 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.886401 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-x2drr"] Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.886736 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" podUID="02e320cc-d38f-4f37-9353-395b869f907e" containerName="dnsmasq-dns" containerID="cri-o://d399556aa9b4c4b2c56dbe148a07ce603cd692ca35c4e36ff9c74d71683c1ab9" gracePeriod=10 Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.894214 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w52kz" event={"ID":"5cf7db27-ff87-481e-a776-cb171e57f4b9","Type":"ContainerDied","Data":"5e62e258658704d926581be80de1e6988565f788ecc5485eb544031f8b66578f"} Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.894250 
4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e62e258658704d926581be80de1e6988565f788ecc5485eb544031f8b66578f" Oct 01 12:53:07 crc kubenswrapper[4727]: I1001 12:53:07.894385 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w52kz" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.125561 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nkrx8"] Oct 01 12:53:08 crc kubenswrapper[4727]: W1001 12:53:08.129276 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4da191c_6509_4bb7_b9b2_344f8224ae58.slice/crio-626614f1e1533b93ced26cb42f32d8b94e6fec134921fca558ff02056e628089 WatchSource:0}: Error finding container 626614f1e1533b93ced26cb42f32d8b94e6fec134921fca558ff02056e628089: Status 404 returned error can't find the container with id 626614f1e1533b93ced26cb42f32d8b94e6fec134921fca558ff02056e628089 Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.226155 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-74wc5"] Oct 01 12:53:08 crc kubenswrapper[4727]: E1001 12:53:08.226534 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cf7db27-ff87-481e-a776-cb171e57f4b9" containerName="glance-db-sync" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.226550 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cf7db27-ff87-481e-a776-cb171e57f4b9" containerName="glance-db-sync" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.226767 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cf7db27-ff87-481e-a776-cb171e57f4b9" containerName="glance-db-sync" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.227410 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.230560 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.232595 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.233198 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xkl2d" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.244032 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-74wc5"] Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.256515 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-config-data\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.256570 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l22hk\" (UniqueName: \"kubernetes.io/projected/5746629a-ce5e-4404-8996-165034633b9e-kube-api-access-l22hk\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.256610 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-combined-ca-bundle\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.256641 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-db-sync-config-data\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.256701 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5746629a-ce5e-4404-8996-165034633b9e-etc-machine-id\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.256733 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-scripts\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.302055 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-czk9d"] Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.358696 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-combined-ca-bundle\") pod \"cinder-db-sync-74wc5\" (UID: 
\"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.358922 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-db-sync-config-data\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.358980 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5746629a-ce5e-4404-8996-165034633b9e-etc-machine-id\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.359027 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-scripts\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.359056 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-config-data\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.359082 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l22hk\" (UniqueName: \"kubernetes.io/projected/5746629a-ce5e-4404-8996-165034633b9e-kube-api-access-l22hk\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.359808 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5746629a-ce5e-4404-8996-165034633b9e-etc-machine-id\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.366434 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-db-sync-config-data\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.373020 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-combined-ca-bundle\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.373097 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-scripts\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.375352 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-config-data\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.381826 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l22hk\" (UniqueName: \"kubernetes.io/projected/5746629a-ce5e-4404-8996-165034633b9e-kube-api-access-l22hk\") pod \"cinder-db-sync-74wc5\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.551657 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.607821 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-4jv4q"] Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.618631 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.630847 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-4jv4q"] Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.664729 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5766l\" (UniqueName: \"kubernetes.io/projected/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-kube-api-access-5766l\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.664767 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.664814 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.664844 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-config\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.664888 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.664913 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.773241 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.773404 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.773532 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5766l\" (UniqueName: \"kubernetes.io/projected/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-kube-api-access-5766l\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.773592 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.773679 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.773754 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-config\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.774700 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-config\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.774919 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.775052 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-ovsdbserver-sb\") pod 
\"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.775525 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.775722 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.815957 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5766l\" (UniqueName: \"kubernetes.io/projected/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-kube-api-access-5766l\") pod \"dnsmasq-dns-8b5c85b87-4jv4q\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.912255 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nkrx8" event={"ID":"d4da191c-6509-4bb7-b9b2-344f8224ae58","Type":"ContainerStarted","Data":"9df9fca93de18a593d6c01b4c56a4f0c1ab4ae6be3d3bb88e43ece750ad4d56c"} Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.912318 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nkrx8" event={"ID":"d4da191c-6509-4bb7-b9b2-344f8224ae58","Type":"ContainerStarted","Data":"626614f1e1533b93ced26cb42f32d8b94e6fec134921fca558ff02056e628089"} Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.914850 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rq4c7" event={"ID":"087bee3f-a34f-43ca-ac4b-b3e46e068898","Type":"ContainerStarted","Data":"60bc5b53364f3a052513adb74535bdb4845204bbd4e01a6a62500eeac82e9b0b"} Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.920448 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2qtjv" event={"ID":"4eb9c560-a9b2-4243-b59a-b40142e48739","Type":"ContainerStarted","Data":"5972f220b28b5bac03f6c034fe2fbae406896dcbdd6967f9eb5538b6ffdc638f"} Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.922145 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25c1560a-bd40-490a-8d86-a71b9a34b7ea","Type":"ContainerStarted","Data":"28c31b69b1f606cc17277305288fa4f7b5f954268c25fa4fb1df23484b1f746a"} Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.923618 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-czk9d" event={"ID":"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b","Type":"ContainerStarted","Data":"e5d027d35ac788cdd3b8fd098e08727c51f90462c0daaa21863e69e95c618ca0"} Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.923645 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-czk9d" event={"ID":"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b","Type":"ContainerStarted","Data":"636e02856f68fc408d402a9217fdf886f2f4eff2957ec0831a4d981720234e08"} Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.927626 4727 generic.go:334] "Generic (PLEG): container finished" 
podID="02e320cc-d38f-4f37-9353-395b869f907e" containerID="d399556aa9b4c4b2c56dbe148a07ce603cd692ca35c4e36ff9c74d71683c1ab9" exitCode=0 Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.927688 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" event={"ID":"02e320cc-d38f-4f37-9353-395b869f907e","Type":"ContainerDied","Data":"d399556aa9b4c4b2c56dbe148a07ce603cd692ca35c4e36ff9c74d71683c1ab9"} Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.929596 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"d15726f80d85ac871118ff8508f8fbb90331c1d082df7e96a9adc970ffc70f86"} Oct 01 12:53:08 crc kubenswrapper[4727]: I1001 12:53:08.947489 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-nkrx8" podStartSLOduration=5.947465699 podStartE2EDuration="5.947465699s" podCreationTimestamp="2025-10-01 12:53:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:08.939489938 +0000 UTC m=+967.260844785" watchObservedRunningTime="2025-10-01 12:53:08.947465699 +0000 UTC m=+967.268820536" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.006976 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2qtjv" podStartSLOduration=2.78609118 podStartE2EDuration="12.006957954s" podCreationTimestamp="2025-10-01 12:52:57 +0000 UTC" firstStartedPulling="2025-10-01 12:52:58.435071375 +0000 UTC m=+956.756426212" lastFinishedPulling="2025-10-01 12:53:07.655938149 +0000 UTC m=+965.977292986" observedRunningTime="2025-10-01 12:53:08.978515278 +0000 UTC m=+967.299870115" watchObservedRunningTime="2025-10-01 12:53:09.006957954 +0000 UTC m=+967.328312781" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.028510 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.045041 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-rq4c7" podStartSLOduration=2.656122426 podStartE2EDuration="11.045020593s" podCreationTimestamp="2025-10-01 12:52:58 +0000 UTC" firstStartedPulling="2025-10-01 12:52:59.394658384 +0000 UTC m=+957.716013221" lastFinishedPulling="2025-10-01 12:53:07.783556551 +0000 UTC m=+966.104911388" observedRunningTime="2025-10-01 12:53:09.035455212 +0000 UTC m=+967.356810069" watchObservedRunningTime="2025-10-01 12:53:09.045020593 +0000 UTC m=+967.366375430" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.078973 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-czk9d" podStartSLOduration=3.078953832 podStartE2EDuration="3.078953832s" podCreationTimestamp="2025-10-01 12:53:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:09.072699126 +0000 UTC m=+967.394053963" watchObservedRunningTime="2025-10-01 12:53:09.078953832 +0000 UTC m=+967.400308679" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.188124 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.288640 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-74wc5"] Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.389699 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hnqr\" (UniqueName: \"kubernetes.io/projected/02e320cc-d38f-4f37-9353-395b869f907e-kube-api-access-7hnqr\") pod \"02e320cc-d38f-4f37-9353-395b869f907e\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.390706 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-ovsdbserver-sb\") pod \"02e320cc-d38f-4f37-9353-395b869f907e\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.390746 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-ovsdbserver-nb\") pod \"02e320cc-d38f-4f37-9353-395b869f907e\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.390774 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-config\") pod \"02e320cc-d38f-4f37-9353-395b869f907e\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.391753 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-dns-swift-storage-0\") pod \"02e320cc-d38f-4f37-9353-395b869f907e\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.392335 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-dns-svc\") pod \"02e320cc-d38f-4f37-9353-395b869f907e\" (UID: \"02e320cc-d38f-4f37-9353-395b869f907e\") " Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.396957 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e320cc-d38f-4f37-9353-395b869f907e-kube-api-access-7hnqr" (OuterVolumeSpecName: "kube-api-access-7hnqr") pod "02e320cc-d38f-4f37-9353-395b869f907e" (UID: "02e320cc-d38f-4f37-9353-395b869f907e"). InnerVolumeSpecName "kube-api-access-7hnqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.445250 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-config" (OuterVolumeSpecName: "config") pod "02e320cc-d38f-4f37-9353-395b869f907e" (UID: "02e320cc-d38f-4f37-9353-395b869f907e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.461302 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "02e320cc-d38f-4f37-9353-395b869f907e" (UID: "02e320cc-d38f-4f37-9353-395b869f907e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.461670 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "02e320cc-d38f-4f37-9353-395b869f907e" (UID: "02e320cc-d38f-4f37-9353-395b869f907e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.474349 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:53:09 crc kubenswrapper[4727]: E1001 12:53:09.474772 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e320cc-d38f-4f37-9353-395b869f907e" containerName="init" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.474797 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e320cc-d38f-4f37-9353-395b869f907e" containerName="init" Oct 01 12:53:09 crc kubenswrapper[4727]: E1001 12:53:09.474824 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e320cc-d38f-4f37-9353-395b869f907e" containerName="dnsmasq-dns" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.474832 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e320cc-d38f-4f37-9353-395b869f907e" containerName="dnsmasq-dns" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.475046 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e320cc-d38f-4f37-9353-395b869f907e" containerName="dnsmasq-dns" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.476242 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.481067 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-w7frw" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.482953 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.483292 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.487832 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.498061 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.498120 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.498133 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.498143 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hnqr\" (UniqueName: \"kubernetes.io/projected/02e320cc-d38f-4f37-9353-395b869f907e-kube-api-access-7hnqr\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.534050 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "02e320cc-d38f-4f37-9353-395b869f907e" (UID: "02e320cc-d38f-4f37-9353-395b869f907e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.568627 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "02e320cc-d38f-4f37-9353-395b869f907e" (UID: "02e320cc-d38f-4f37-9353-395b869f907e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.599839 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db62b86-1237-457e-91fb-3fcee6871537-logs\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.599891 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtlfb\" (UniqueName: \"kubernetes.io/projected/7db62b86-1237-457e-91fb-3fcee6871537-kube-api-access-gtlfb\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.599946 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-config-data\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.599962 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7db62b86-1237-457e-91fb-3fcee6871537-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.599990 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-scripts\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.600132 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.600151 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.600193 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.600206 4727 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02e320cc-d38f-4f37-9353-395b869f907e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.635746 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-8b5c85b87-4jv4q"] Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.704704 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.704759 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.704805 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db62b86-1237-457e-91fb-3fcee6871537-logs\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.704834 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtlfb\" (UniqueName: \"kubernetes.io/projected/7db62b86-1237-457e-91fb-3fcee6871537-kube-api-access-gtlfb\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.704904 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-config-data\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.704920 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7db62b86-1237-457e-91fb-3fcee6871537-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.704955 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-scripts\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.706223 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db62b86-1237-457e-91fb-3fcee6871537-logs\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.706731 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.707343 
4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7db62b86-1237-457e-91fb-3fcee6871537-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.710223 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-scripts\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.713934 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.724442 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-config-data\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.735337 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtlfb\" (UniqueName: \"kubernetes.io/projected/7db62b86-1237-457e-91fb-3fcee6871537-kube-api-access-gtlfb\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.737770 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.819472 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.841751 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.843292 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.852229 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.860032 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.960229 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" event={"ID":"02e320cc-d38f-4f37-9353-395b869f907e","Type":"ContainerDied","Data":"bf5f36e3ff4cfd07e908a91326eed2973cfc1dec33663ec50451d6eac5adb746"} Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.960423 4727 scope.go:117] "RemoveContainer" containerID="d399556aa9b4c4b2c56dbe148a07ce603cd692ca35c4e36ff9c74d71683c1ab9" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.960683 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-x2drr" Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.973619 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" event={"ID":"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd","Type":"ContainerStarted","Data":"87195537dc5d93a3a59c7d63569e85d8a23fef1fbace7c73679ed54f519f1aaa"} Oct 01 12:53:09 crc kubenswrapper[4727]: I1001 12:53:09.981064 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-74wc5" event={"ID":"5746629a-ce5e-4404-8996-165034633b9e","Type":"ContainerStarted","Data":"c02577c694ac2f9289bab056b5cb5264f3bd8ac0759174045ceea294b58272cd"} Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.012641 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-logs\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.012743 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m2jq\" (UniqueName: \"kubernetes.io/projected/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-kube-api-access-2m2jq\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.012828 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.012859 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.013119 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.013150 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.013238 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.019177 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-x2drr"] Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.031777 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-x2drr"] Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.114529 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.114639 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-logs\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.114714 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m2jq\" (UniqueName: \"kubernetes.io/projected/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-kube-api-access-2m2jq\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.114742 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.114765 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.114959 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.114986 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.117648 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.118112 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.118293 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-logs\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.123498 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.124711 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.125697 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.142089 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m2jq\" (UniqueName: \"kubernetes.io/projected/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-kube-api-access-2m2jq\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.156710 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: 
I1001 12:53:10.201059 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:10 crc kubenswrapper[4727]: I1001 12:53:10.387173 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02e320cc-d38f-4f37-9353-395b869f907e" path="/var/lib/kubelet/pods/02e320cc-d38f-4f37-9353-395b869f907e/volumes" Oct 01 12:53:11 crc kubenswrapper[4727]: I1001 12:53:11.560108 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:53:11 crc kubenswrapper[4727]: I1001 12:53:11.632373 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:53:11 crc kubenswrapper[4727]: I1001 12:53:11.802338 4727 scope.go:117] "RemoveContainer" containerID="6201cdb8b3bdfcf391a403999d27d7f25314bdbe3ad2b808f288dcf3e3ff2363" Oct 01 12:53:12 crc kubenswrapper[4727]: I1001 12:53:12.028180 4727 generic.go:334] "Generic (PLEG): container finished" podID="66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" containerID="9d79c86d400af50f8cb425e8d0b71ec2bd4d00a0215e5caa9455a201f00afcff" exitCode=0 Oct 01 12:53:12 crc kubenswrapper[4727]: I1001 12:53:12.028277 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" event={"ID":"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd","Type":"ContainerDied","Data":"9d79c86d400af50f8cb425e8d0b71ec2bd4d00a0215e5caa9455a201f00afcff"} Oct 01 12:53:12 crc kubenswrapper[4727]: I1001 12:53:12.445262 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:53:12 crc kubenswrapper[4727]: W1001 12:53:12.449721 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7db62b86_1237_457e_91fb_3fcee6871537.slice/crio-12c15fb07e496644ae6df62af968da831f4616adb8250bc1a45c9aaefe6c211e WatchSource:0}: Error finding container 12c15fb07e496644ae6df62af968da831f4616adb8250bc1a45c9aaefe6c211e: Status 404 returned error can't find the container with id 12c15fb07e496644ae6df62af968da831f4616adb8250bc1a45c9aaefe6c211e Oct 01 12:53:12 crc kubenswrapper[4727]: I1001 12:53:12.546810 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:53:12 crc kubenswrapper[4727]: W1001 12:53:12.556233 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e2e2a1c_0e9f_4ad7_9e0a_e1d46682d31e.slice/crio-8866cb0ac21cd6bdd24abc2a0375325291a97f72936ef8e7164a872b50f265c4 WatchSource:0}: Error finding container 8866cb0ac21cd6bdd24abc2a0375325291a97f72936ef8e7164a872b50f265c4: Status 404 returned error can't find the container with id 8866cb0ac21cd6bdd24abc2a0375325291a97f72936ef8e7164a872b50f265c4 Oct 01 12:53:13 crc kubenswrapper[4727]: I1001 12:53:13.056675 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" event={"ID":"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd","Type":"ContainerStarted","Data":"86df7a77f4c956945ec38fb19ee67590cd65a17a7009d89bde5b6c75861e3052"} Oct 01 12:53:13 crc kubenswrapper[4727]: I1001 12:53:13.058782 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:13 crc kubenswrapper[4727]: I1001 12:53:13.062747 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"25c1560a-bd40-490a-8d86-a71b9a34b7ea","Type":"ContainerStarted","Data":"8155aac14dfd4155e42e2bf5e444b03a3c02e30d5e3e12e85b5999ab62237a25"} Oct 01 12:53:13 crc kubenswrapper[4727]: I1001 12:53:13.065570 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e","Type":"ContainerStarted","Data":"8866cb0ac21cd6bdd24abc2a0375325291a97f72936ef8e7164a872b50f265c4"} Oct 01 12:53:13 crc kubenswrapper[4727]: I1001 12:53:13.068068 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7db62b86-1237-457e-91fb-3fcee6871537","Type":"ContainerStarted","Data":"12c15fb07e496644ae6df62af968da831f4616adb8250bc1a45c9aaefe6c211e"} Oct 01 12:53:13 crc kubenswrapper[4727]: I1001 12:53:13.083033 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" podStartSLOduration=5.083016021 podStartE2EDuration="5.083016021s" podCreationTimestamp="2025-10-01 12:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:13.079377607 +0000 UTC m=+971.400732444" watchObservedRunningTime="2025-10-01 12:53:13.083016021 +0000 UTC m=+971.404370858" Oct 01 12:53:14 crc kubenswrapper[4727]: I1001 12:53:14.079770 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7db62b86-1237-457e-91fb-3fcee6871537","Type":"ContainerStarted","Data":"1591d92610851f82acab366144f51a67d2b71b097e0ee852513ea2d367e43b2f"} Oct 01 12:53:14 crc kubenswrapper[4727]: I1001 12:53:14.083042 4727 generic.go:334] "Generic (PLEG): container finished" podID="d3f868f9-d4a4-41e8-ac05-f0a1c659e18b" containerID="e5d027d35ac788cdd3b8fd098e08727c51f90462c0daaa21863e69e95c618ca0" exitCode=0 Oct 01 12:53:14 crc kubenswrapper[4727]: I1001 12:53:14.083118 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-czk9d" event={"ID":"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b","Type":"ContainerDied","Data":"e5d027d35ac788cdd3b8fd098e08727c51f90462c0daaa21863e69e95c618ca0"} Oct 01 12:53:14 crc kubenswrapper[4727]: I1001 12:53:14.087505 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e","Type":"ContainerStarted","Data":"3f430d4178c1ed8bfa05b43cef74db794943f13485e6d9aa5fb7c8a6e71f1e9b"} Oct 01 12:53:15 crc kubenswrapper[4727]: I1001 12:53:15.123158 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7db62b86-1237-457e-91fb-3fcee6871537","Type":"ContainerStarted","Data":"ab3e7912ae964b4b2d5bbd2983f8082e4d1e899b5016429305df27998cad80b1"} Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.147120 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7db62b86-1237-457e-91fb-3fcee6871537" containerName="glance-log" containerID="cri-o://1591d92610851f82acab366144f51a67d2b71b097e0ee852513ea2d367e43b2f" gracePeriod=30 Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.147248 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7db62b86-1237-457e-91fb-3fcee6871537" containerName="glance-httpd" 
containerID="cri-o://ab3e7912ae964b4b2d5bbd2983f8082e4d1e899b5016429305df27998cad80b1" gracePeriod=30 Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.679229 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.698588 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.698566033 podStartE2EDuration="10.698566033s" podCreationTimestamp="2025-10-01 12:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:18.18217908 +0000 UTC m=+976.503533937" watchObservedRunningTime="2025-10-01 12:53:18.698566033 +0000 UTC m=+977.019920890" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.805550 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-combined-ca-bundle\") pod \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.805669 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-credential-keys\") pod \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.805707 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-fernet-keys\") pod \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.805731 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-config-data\") pod \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.805947 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8plb\" (UniqueName: \"kubernetes.io/projected/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-kube-api-access-x8plb\") pod \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.806027 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-scripts\") pod \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\" (UID: \"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b\") " Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.812180 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-kube-api-access-x8plb" (OuterVolumeSpecName: "kube-api-access-x8plb") pod "d3f868f9-d4a4-41e8-ac05-f0a1c659e18b" (UID: "d3f868f9-d4a4-41e8-ac05-f0a1c659e18b"). InnerVolumeSpecName "kube-api-access-x8plb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.812473 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d3f868f9-d4a4-41e8-ac05-f0a1c659e18b" (UID: "d3f868f9-d4a4-41e8-ac05-f0a1c659e18b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.813450 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-scripts" (OuterVolumeSpecName: "scripts") pod "d3f868f9-d4a4-41e8-ac05-f0a1c659e18b" (UID: "d3f868f9-d4a4-41e8-ac05-f0a1c659e18b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.814253 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d3f868f9-d4a4-41e8-ac05-f0a1c659e18b" (UID: "d3f868f9-d4a4-41e8-ac05-f0a1c659e18b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.831489 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3f868f9-d4a4-41e8-ac05-f0a1c659e18b" (UID: "d3f868f9-d4a4-41e8-ac05-f0a1c659e18b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.837397 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-config-data" (OuterVolumeSpecName: "config-data") pod "d3f868f9-d4a4-41e8-ac05-f0a1c659e18b" (UID: "d3f868f9-d4a4-41e8-ac05-f0a1c659e18b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.907565 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.907605 4727 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.907617 4727 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.907628 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.907640 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8plb\" (UniqueName: \"kubernetes.io/projected/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-kube-api-access-x8plb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:18 crc kubenswrapper[4727]: I1001 12:53:18.907653 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.030161 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.081054 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-dbh85"] Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.084516 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" podUID="aca1aa1b-b52a-447a-aa0f-771345a441c4" containerName="dnsmasq-dns" containerID="cri-o://af08112beb9a78d13612f378533eb4aa62d825792604cb0de7900927201b3d10" gracePeriod=10 Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.158430 4727 generic.go:334] "Generic (PLEG): container finished" podID="7db62b86-1237-457e-91fb-3fcee6871537" containerID="1591d92610851f82acab366144f51a67d2b71b097e0ee852513ea2d367e43b2f" exitCode=143 Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.158514 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7db62b86-1237-457e-91fb-3fcee6871537","Type":"ContainerDied","Data":"1591d92610851f82acab366144f51a67d2b71b097e0ee852513ea2d367e43b2f"} Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.160536 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-czk9d" event={"ID":"d3f868f9-d4a4-41e8-ac05-f0a1c659e18b","Type":"ContainerDied","Data":"636e02856f68fc408d402a9217fdf886f2f4eff2957ec0831a4d981720234e08"} Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.160564 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="636e02856f68fc408d402a9217fdf886f2f4eff2957ec0831a4d981720234e08" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.160619 4727 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/keystone-bootstrap-czk9d" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.841442 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7f4d7ff84c-l27rl"] Oct 01 12:53:19 crc kubenswrapper[4727]: E1001 12:53:19.842189 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f868f9-d4a4-41e8-ac05-f0a1c659e18b" containerName="keystone-bootstrap" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.842204 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f868f9-d4a4-41e8-ac05-f0a1c659e18b" containerName="keystone-bootstrap" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.842362 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f868f9-d4a4-41e8-ac05-f0a1c659e18b" containerName="keystone-bootstrap" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.842932 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.861776 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f4d7ff84c-l27rl"] Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.886960 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.887022 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.887201 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.888089 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.890931 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.891219 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jxtdc" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.926342 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-internal-tls-certs\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.926423 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-scripts\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.926508 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzrfz\" (UniqueName: \"kubernetes.io/projected/80e31dea-5550-409a-8f5e-5eec07106dcd-kube-api-access-fzrfz\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.926581 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-combined-ca-bundle\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.926628 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-config-data\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.926872 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-credential-keys\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.926973 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-public-tls-certs\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:19 crc kubenswrapper[4727]: I1001 12:53:19.927065 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-fernet-keys\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.028668 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-fernet-keys\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.028808 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-internal-tls-certs\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.028843 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-scripts\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.029680 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzrfz\" (UniqueName: \"kubernetes.io/projected/80e31dea-5550-409a-8f5e-5eec07106dcd-kube-api-access-fzrfz\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.029738 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-combined-ca-bundle\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.029771 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-config-data\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.029842 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-credential-keys\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.029885 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-public-tls-certs\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.035351 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-public-tls-certs\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.035477 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-credential-keys\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.035984 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-internal-tls-certs\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.037048 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-scripts\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.038497 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-combined-ca-bundle\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.047107 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-config-data\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " 
pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.048184 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzrfz\" (UniqueName: \"kubernetes.io/projected/80e31dea-5550-409a-8f5e-5eec07106dcd-kube-api-access-fzrfz\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.049341 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80e31dea-5550-409a-8f5e-5eec07106dcd-fernet-keys\") pod \"keystone-7f4d7ff84c-l27rl\" (UID: \"80e31dea-5550-409a-8f5e-5eec07106dcd\") " pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.199935 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:20 crc kubenswrapper[4727]: I1001 12:53:20.626590 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f4d7ff84c-l27rl"] Oct 01 12:53:20 crc kubenswrapper[4727]: W1001 12:53:20.632027 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80e31dea_5550_409a_8f5e_5eec07106dcd.slice/crio-72d60c5a31d34816aa19eeb9443d7bcca2a690e2afd827ae17bdd67799bcfe1b WatchSource:0}: Error finding container 72d60c5a31d34816aa19eeb9443d7bcca2a690e2afd827ae17bdd67799bcfe1b: Status 404 returned error can't find the container with id 72d60c5a31d34816aa19eeb9443d7bcca2a690e2afd827ae17bdd67799bcfe1b Oct 01 12:53:21 crc kubenswrapper[4727]: I1001 12:53:21.179562 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f4d7ff84c-l27rl" event={"ID":"80e31dea-5550-409a-8f5e-5eec07106dcd","Type":"ContainerStarted","Data":"72d60c5a31d34816aa19eeb9443d7bcca2a690e2afd827ae17bdd67799bcfe1b"} Oct 01 12:53:22 crc kubenswrapper[4727]: I1001 12:53:22.701036 4727 generic.go:334] "Generic (PLEG): container finished" podID="7db62b86-1237-457e-91fb-3fcee6871537" containerID="ab3e7912ae964b4b2d5bbd2983f8082e4d1e899b5016429305df27998cad80b1" exitCode=0 Oct 01 12:53:22 crc kubenswrapper[4727]: I1001 12:53:22.701361 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7db62b86-1237-457e-91fb-3fcee6871537","Type":"ContainerDied","Data":"ab3e7912ae964b4b2d5bbd2983f8082e4d1e899b5016429305df27998cad80b1"} Oct 01 12:53:22 crc kubenswrapper[4727]: I1001 12:53:22.806629 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" podUID="aca1aa1b-b52a-447a-aa0f-771345a441c4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: connect: connection refused" Oct 01 12:53:23 crc kubenswrapper[4727]: I1001 12:53:23.711875 4727 generic.go:334] "Generic (PLEG): container finished" podID="aca1aa1b-b52a-447a-aa0f-771345a441c4" containerID="af08112beb9a78d13612f378533eb4aa62d825792604cb0de7900927201b3d10" exitCode=0 Oct 01 12:53:23 crc kubenswrapper[4727]: I1001 12:53:23.712267 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" event={"ID":"aca1aa1b-b52a-447a-aa0f-771345a441c4","Type":"ContainerDied","Data":"af08112beb9a78d13612f378533eb4aa62d825792604cb0de7900927201b3d10"} Oct 01 12:53:23 crc kubenswrapper[4727]: I1001 12:53:23.714852 4727 generic.go:334] "Generic 
(PLEG): container finished" podID="4eb9c560-a9b2-4243-b59a-b40142e48739" containerID="5972f220b28b5bac03f6c034fe2fbae406896dcbdd6967f9eb5538b6ffdc638f" exitCode=0 Oct 01 12:53:23 crc kubenswrapper[4727]: I1001 12:53:23.714907 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2qtjv" event={"ID":"4eb9c560-a9b2-4243-b59a-b40142e48739","Type":"ContainerDied","Data":"5972f220b28b5bac03f6c034fe2fbae406896dcbdd6967f9eb5538b6ffdc638f"} Oct 01 12:53:23 crc kubenswrapper[4727]: I1001 12:53:23.717161 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e","Type":"ContainerStarted","Data":"341898e00a7ccaef92cf5f0322d69b7a89d64664b60d5a51e2b552b7da137d95"} Oct 01 12:53:23 crc kubenswrapper[4727]: I1001 12:53:23.717302 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" containerName="glance-log" containerID="cri-o://3f430d4178c1ed8bfa05b43cef74db794943f13485e6d9aa5fb7c8a6e71f1e9b" gracePeriod=30 Oct 01 12:53:23 crc kubenswrapper[4727]: I1001 12:53:23.717621 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" containerName="glance-httpd" containerID="cri-o://341898e00a7ccaef92cf5f0322d69b7a89d64664b60d5a51e2b552b7da137d95" gracePeriod=30 Oct 01 12:53:23 crc kubenswrapper[4727]: I1001 12:53:23.720557 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f4d7ff84c-l27rl" event={"ID":"80e31dea-5550-409a-8f5e-5eec07106dcd","Type":"ContainerStarted","Data":"a141314ab622989ee9494e9714726b8c9edce599fe9df41856a377a949f4c229"} Oct 01 12:53:23 crc kubenswrapper[4727]: I1001 12:53:23.721288 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:23 crc kubenswrapper[4727]: I1001 12:53:23.770664 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7f4d7ff84c-l27rl" podStartSLOduration=4.770640568 podStartE2EDuration="4.770640568s" podCreationTimestamp="2025-10-01 12:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:23.758424447 +0000 UTC m=+982.079779284" watchObservedRunningTime="2025-10-01 12:53:23.770640568 +0000 UTC m=+982.091995415" Oct 01 12:53:23 crc kubenswrapper[4727]: I1001 12:53:23.780157 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=15.780139905 podStartE2EDuration="15.780139905s" podCreationTimestamp="2025-10-01 12:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:23.778246666 +0000 UTC m=+982.099601523" watchObservedRunningTime="2025-10-01 12:53:23.780139905 +0000 UTC m=+982.101494752" Oct 01 12:53:24 crc kubenswrapper[4727]: I1001 12:53:24.734921 4727 generic.go:334] "Generic (PLEG): container finished" podID="7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" containerID="341898e00a7ccaef92cf5f0322d69b7a89d64664b60d5a51e2b552b7da137d95" exitCode=0 Oct 01 12:53:24 crc kubenswrapper[4727]: I1001 12:53:24.734955 4727 generic.go:334] "Generic (PLEG): container finished" podID="7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" 
containerID="3f430d4178c1ed8bfa05b43cef74db794943f13485e6d9aa5fb7c8a6e71f1e9b" exitCode=143 Oct 01 12:53:24 crc kubenswrapper[4727]: I1001 12:53:24.735091 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e","Type":"ContainerDied","Data":"341898e00a7ccaef92cf5f0322d69b7a89d64664b60d5a51e2b552b7da137d95"} Oct 01 12:53:24 crc kubenswrapper[4727]: I1001 12:53:24.735130 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e","Type":"ContainerDied","Data":"3f430d4178c1ed8bfa05b43cef74db794943f13485e6d9aa5fb7c8a6e71f1e9b"} Oct 01 12:53:25 crc kubenswrapper[4727]: I1001 12:53:25.749543 4727 generic.go:334] "Generic (PLEG): container finished" podID="087bee3f-a34f-43ca-ac4b-b3e46e068898" containerID="60bc5b53364f3a052513adb74535bdb4845204bbd4e01a6a62500eeac82e9b0b" exitCode=0 Oct 01 12:53:25 crc kubenswrapper[4727]: I1001 12:53:25.749618 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rq4c7" event={"ID":"087bee3f-a34f-43ca-ac4b-b3e46e068898","Type":"ContainerDied","Data":"60bc5b53364f3a052513adb74535bdb4845204bbd4e01a6a62500eeac82e9b0b"} Oct 01 12:53:31 crc kubenswrapper[4727]: E1001 12:53:31.972367 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 01 12:53:31 crc kubenswrapper[4727]: E1001 12:53:31.973293 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l22hk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-74wc5_openstack(5746629a-ce5e-4404-8996-165034633b9e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:53:31 crc kubenswrapper[4727]: E1001 12:53:31.974862 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-74wc5" podUID="5746629a-ce5e-4404-8996-165034633b9e" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.037356 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.045715 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2qtjv" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.053639 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-scripts\") pod \"4eb9c560-a9b2-4243-b59a-b40142e48739\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.053689 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-config-data\") pod \"7db62b86-1237-457e-91fb-3fcee6871537\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.053753 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-scripts\") pod \"7db62b86-1237-457e-91fb-3fcee6871537\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.053778 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-combined-ca-bundle\") pod \"4eb9c560-a9b2-4243-b59a-b40142e48739\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.053816 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb9c560-a9b2-4243-b59a-b40142e48739-logs\") pod \"4eb9c560-a9b2-4243-b59a-b40142e48739\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.053837 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-combined-ca-bundle\") pod \"7db62b86-1237-457e-91fb-3fcee6871537\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.053901 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-config-data\") pod \"4eb9c560-a9b2-4243-b59a-b40142e48739\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.053953 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtlfb\" (UniqueName: \"kubernetes.io/projected/7db62b86-1237-457e-91fb-3fcee6871537-kube-api-access-gtlfb\") pod \"7db62b86-1237-457e-91fb-3fcee6871537\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.054007 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmjtd\" (UniqueName: \"kubernetes.io/projected/4eb9c560-a9b2-4243-b59a-b40142e48739-kube-api-access-xmjtd\") pod \"4eb9c560-a9b2-4243-b59a-b40142e48739\" (UID: \"4eb9c560-a9b2-4243-b59a-b40142e48739\") " Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.054026 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db62b86-1237-457e-91fb-3fcee6871537-logs\") pod \"7db62b86-1237-457e-91fb-3fcee6871537\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " Oct 
01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.054062 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"7db62b86-1237-457e-91fb-3fcee6871537\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.054105 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7db62b86-1237-457e-91fb-3fcee6871537-httpd-run\") pod \"7db62b86-1237-457e-91fb-3fcee6871537\" (UID: \"7db62b86-1237-457e-91fb-3fcee6871537\") " Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.055492 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db62b86-1237-457e-91fb-3fcee6871537-logs" (OuterVolumeSpecName: "logs") pod "7db62b86-1237-457e-91fb-3fcee6871537" (UID: "7db62b86-1237-457e-91fb-3fcee6871537"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.056642 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db62b86-1237-457e-91fb-3fcee6871537-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7db62b86-1237-457e-91fb-3fcee6871537" (UID: "7db62b86-1237-457e-91fb-3fcee6871537"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.057909 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb9c560-a9b2-4243-b59a-b40142e48739-logs" (OuterVolumeSpecName: "logs") pod "4eb9c560-a9b2-4243-b59a-b40142e48739" (UID: "4eb9c560-a9b2-4243-b59a-b40142e48739"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.062838 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "7db62b86-1237-457e-91fb-3fcee6871537" (UID: "7db62b86-1237-457e-91fb-3fcee6871537"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.063692 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db62b86-1237-457e-91fb-3fcee6871537-kube-api-access-gtlfb" (OuterVolumeSpecName: "kube-api-access-gtlfb") pod "7db62b86-1237-457e-91fb-3fcee6871537" (UID: "7db62b86-1237-457e-91fb-3fcee6871537"). InnerVolumeSpecName "kube-api-access-gtlfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.069515 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb9c560-a9b2-4243-b59a-b40142e48739-kube-api-access-xmjtd" (OuterVolumeSpecName: "kube-api-access-xmjtd") pod "4eb9c560-a9b2-4243-b59a-b40142e48739" (UID: "4eb9c560-a9b2-4243-b59a-b40142e48739"). InnerVolumeSpecName "kube-api-access-xmjtd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.078552 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-scripts" (OuterVolumeSpecName: "scripts") pod "7db62b86-1237-457e-91fb-3fcee6871537" (UID: "7db62b86-1237-457e-91fb-3fcee6871537"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.078693 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-scripts" (OuterVolumeSpecName: "scripts") pod "4eb9c560-a9b2-4243-b59a-b40142e48739" (UID: "4eb9c560-a9b2-4243-b59a-b40142e48739"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.117702 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4eb9c560-a9b2-4243-b59a-b40142e48739" (UID: "4eb9c560-a9b2-4243-b59a-b40142e48739"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.119328 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-config-data" (OuterVolumeSpecName: "config-data") pod "4eb9c560-a9b2-4243-b59a-b40142e48739" (UID: "4eb9c560-a9b2-4243-b59a-b40142e48739"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.120758 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7db62b86-1237-457e-91fb-3fcee6871537" (UID: "7db62b86-1237-457e-91fb-3fcee6871537"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.155765 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.155801 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.155814 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb9c560-a9b2-4243-b59a-b40142e48739-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.155822 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.155831 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.155841 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtlfb\" (UniqueName: \"kubernetes.io/projected/7db62b86-1237-457e-91fb-3fcee6871537-kube-api-access-gtlfb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.155851 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmjtd\" (UniqueName: \"kubernetes.io/projected/4eb9c560-a9b2-4243-b59a-b40142e48739-kube-api-access-xmjtd\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.155861 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db62b86-1237-457e-91fb-3fcee6871537-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.155891 4727 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.155909 4727 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7db62b86-1237-457e-91fb-3fcee6871537-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.155919 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4eb9c560-a9b2-4243-b59a-b40142e48739-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.173601 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-config-data" (OuterVolumeSpecName: "config-data") pod "7db62b86-1237-457e-91fb-3fcee6871537" (UID: "7db62b86-1237-457e-91fb-3fcee6871537"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.176890 4727 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.257389 4727 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.257452 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db62b86-1237-457e-91fb-3fcee6871537-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.807795 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" podUID="aca1aa1b-b52a-447a-aa0f-771345a441c4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: i/o timeout" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.819159 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7db62b86-1237-457e-91fb-3fcee6871537","Type":"ContainerDied","Data":"12c15fb07e496644ae6df62af968da831f4616adb8250bc1a45c9aaefe6c211e"} Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.819210 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.819212 4727 scope.go:117] "RemoveContainer" containerID="ab3e7912ae964b4b2d5bbd2983f8082e4d1e899b5016429305df27998cad80b1" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.821721 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2qtjv" event={"ID":"4eb9c560-a9b2-4243-b59a-b40142e48739","Type":"ContainerDied","Data":"3285cece2bdf68c89e745038a61cc1ac1734b0d0c25e2789b0df2636d5a66ffe"} Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.821759 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3285cece2bdf68c89e745038a61cc1ac1734b0d0c25e2789b0df2636d5a66ffe" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.821816 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2qtjv" Oct 01 12:53:32 crc kubenswrapper[4727]: E1001 12:53:32.824402 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-74wc5" podUID="5746629a-ce5e-4404-8996-165034633b9e" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.870623 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.886229 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.901040 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:53:32 crc kubenswrapper[4727]: E1001 12:53:32.901487 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db62b86-1237-457e-91fb-3fcee6871537" containerName="glance-log" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.901505 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db62b86-1237-457e-91fb-3fcee6871537" containerName="glance-log" Oct 01 12:53:32 crc kubenswrapper[4727]: E1001 12:53:32.901526 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb9c560-a9b2-4243-b59a-b40142e48739" containerName="placement-db-sync" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.901533 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb9c560-a9b2-4243-b59a-b40142e48739" containerName="placement-db-sync" Oct 01 12:53:32 crc kubenswrapper[4727]: E1001 12:53:32.901542 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db62b86-1237-457e-91fb-3fcee6871537" containerName="glance-httpd" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.901549 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db62b86-1237-457e-91fb-3fcee6871537" containerName="glance-httpd" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.901737 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db62b86-1237-457e-91fb-3fcee6871537" containerName="glance-log" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.901758 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db62b86-1237-457e-91fb-3fcee6871537" containerName="glance-httpd" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.901771 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb9c560-a9b2-4243-b59a-b40142e48739" containerName="placement-db-sync" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.902780 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.910169 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.910349 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 12:53:32 crc kubenswrapper[4727]: I1001 12:53:32.920421 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.073288 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.073356 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.073410 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6468c22-4086-4884-a045-9f77a1b459d6-logs\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.073429 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6468c22-4086-4884-a045-9f77a1b459d6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.073462 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.073497 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.073542 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqslh\" (UniqueName: \"kubernetes.io/projected/a6468c22-4086-4884-a045-9f77a1b459d6-kube-api-access-hqslh\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.073576 4727 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.174890 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6468c22-4086-4884-a045-9f77a1b459d6-logs\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.174946 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6468c22-4086-4884-a045-9f77a1b459d6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.174987 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.175053 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.175101 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqslh\" (UniqueName: \"kubernetes.io/projected/a6468c22-4086-4884-a045-9f77a1b459d6-kube-api-access-hqslh\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.175149 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.175210 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.175248 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.176204 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a6468c22-4086-4884-a045-9f77a1b459d6-logs\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.176411 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6468c22-4086-4884-a045-9f77a1b459d6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.176793 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.183277 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-config-data\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.191742 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.196890 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-scripts\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.201306 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.243311 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.249826 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqslh\" (UniqueName: \"kubernetes.io/projected/a6468c22-4086-4884-a045-9f77a1b459d6-kube-api-access-hqslh\") pod \"glance-default-external-api-0\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.310708 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-648455799b-c8jzs"] Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.313119 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.316714 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.317196 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.317410 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.317557 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kfxjq" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.321313 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.386107 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-648455799b-c8jzs"] Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.481941 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87998a25-3079-49a0-93da-d4326ed0ccc3-logs\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.481990 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-internal-tls-certs\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.482029 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m88n\" (UniqueName: \"kubernetes.io/projected/87998a25-3079-49a0-93da-d4326ed0ccc3-kube-api-access-5m88n\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.482048 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-combined-ca-bundle\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.482113 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-public-tls-certs\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.482262 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-scripts\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.482335 
4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-config-data\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.530869 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.584212 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-config-data\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.585536 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87998a25-3079-49a0-93da-d4326ed0ccc3-logs\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.585593 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-internal-tls-certs\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.585638 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m88n\" (UniqueName: \"kubernetes.io/projected/87998a25-3079-49a0-93da-d4326ed0ccc3-kube-api-access-5m88n\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.585673 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-combined-ca-bundle\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.585734 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-public-tls-certs\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.585896 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-scripts\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.585962 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87998a25-3079-49a0-93da-d4326ed0ccc3-logs\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 
12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.589368 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-combined-ca-bundle\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.590069 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-scripts\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.590944 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-config-data\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.591665 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-public-tls-certs\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.592479 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87998a25-3079-49a0-93da-d4326ed0ccc3-internal-tls-certs\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.608973 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m88n\" (UniqueName: \"kubernetes.io/projected/87998a25-3079-49a0-93da-d4326ed0ccc3-kube-api-access-5m88n\") pod \"placement-648455799b-c8jzs\" (UID: \"87998a25-3079-49a0-93da-d4326ed0ccc3\") " pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.668044 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.754923 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.762395 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.830979 4727 scope.go:117] "RemoveContainer" containerID="1591d92610851f82acab366144f51a67d2b71b097e0ee852513ea2d367e43b2f" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.843514 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rq4c7" event={"ID":"087bee3f-a34f-43ca-ac4b-b3e46e068898","Type":"ContainerDied","Data":"0eedd0e3bdbed97a0116bc3e467603f85a3ef30bcb6f55078aca79eba54d2a10"} Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.843554 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0eedd0e3bdbed97a0116bc3e467603f85a3ef30bcb6f55078aca79eba54d2a10" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.843618 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rq4c7" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.849145 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" event={"ID":"aca1aa1b-b52a-447a-aa0f-771345a441c4","Type":"ContainerDied","Data":"b612d444d2145887f0b1a50a062913b523397dc52876ae3ca1b2f5041984105f"} Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.849223 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.893755 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087bee3f-a34f-43ca-ac4b-b3e46e068898-combined-ca-bundle\") pod \"087bee3f-a34f-43ca-ac4b-b3e46e068898\" (UID: \"087bee3f-a34f-43ca-ac4b-b3e46e068898\") " Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.893836 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-dns-swift-storage-0\") pod \"aca1aa1b-b52a-447a-aa0f-771345a441c4\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.893910 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-ovsdbserver-nb\") pod \"aca1aa1b-b52a-447a-aa0f-771345a441c4\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.893953 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-ovsdbserver-sb\") pod \"aca1aa1b-b52a-447a-aa0f-771345a441c4\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.894020 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-config\") pod \"aca1aa1b-b52a-447a-aa0f-771345a441c4\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.894072 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/087bee3f-a34f-43ca-ac4b-b3e46e068898-db-sync-config-data\") pod \"087bee3f-a34f-43ca-ac4b-b3e46e068898\" (UID: 
\"087bee3f-a34f-43ca-ac4b-b3e46e068898\") " Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.894100 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4blsk\" (UniqueName: \"kubernetes.io/projected/087bee3f-a34f-43ca-ac4b-b3e46e068898-kube-api-access-4blsk\") pod \"087bee3f-a34f-43ca-ac4b-b3e46e068898\" (UID: \"087bee3f-a34f-43ca-ac4b-b3e46e068898\") " Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.894807 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4hqr\" (UniqueName: \"kubernetes.io/projected/aca1aa1b-b52a-447a-aa0f-771345a441c4-kube-api-access-p4hqr\") pod \"aca1aa1b-b52a-447a-aa0f-771345a441c4\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.894891 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-dns-svc\") pod \"aca1aa1b-b52a-447a-aa0f-771345a441c4\" (UID: \"aca1aa1b-b52a-447a-aa0f-771345a441c4\") " Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.911722 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087bee3f-a34f-43ca-ac4b-b3e46e068898-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "087bee3f-a34f-43ca-ac4b-b3e46e068898" (UID: "087bee3f-a34f-43ca-ac4b-b3e46e068898"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.917198 4727 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/087bee3f-a34f-43ca-ac4b-b3e46e068898-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.924310 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca1aa1b-b52a-447a-aa0f-771345a441c4-kube-api-access-p4hqr" (OuterVolumeSpecName: "kube-api-access-p4hqr") pod "aca1aa1b-b52a-447a-aa0f-771345a441c4" (UID: "aca1aa1b-b52a-447a-aa0f-771345a441c4"). InnerVolumeSpecName "kube-api-access-p4hqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.942783 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/087bee3f-a34f-43ca-ac4b-b3e46e068898-kube-api-access-4blsk" (OuterVolumeSpecName: "kube-api-access-4blsk") pod "087bee3f-a34f-43ca-ac4b-b3e46e068898" (UID: "087bee3f-a34f-43ca-ac4b-b3e46e068898"). InnerVolumeSpecName "kube-api-access-4blsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.977503 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087bee3f-a34f-43ca-ac4b-b3e46e068898-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "087bee3f-a34f-43ca-ac4b-b3e46e068898" (UID: "087bee3f-a34f-43ca-ac4b-b3e46e068898"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.991112 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aca1aa1b-b52a-447a-aa0f-771345a441c4" (UID: "aca1aa1b-b52a-447a-aa0f-771345a441c4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.997114 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aca1aa1b-b52a-447a-aa0f-771345a441c4" (UID: "aca1aa1b-b52a-447a-aa0f-771345a441c4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.997204 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aca1aa1b-b52a-447a-aa0f-771345a441c4" (UID: "aca1aa1b-b52a-447a-aa0f-771345a441c4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:33 crc kubenswrapper[4727]: I1001 12:53:33.998943 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-config" (OuterVolumeSpecName: "config") pod "aca1aa1b-b52a-447a-aa0f-771345a441c4" (UID: "aca1aa1b-b52a-447a-aa0f-771345a441c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.010996 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aca1aa1b-b52a-447a-aa0f-771345a441c4" (UID: "aca1aa1b-b52a-447a-aa0f-771345a441c4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.019171 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.019215 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.019226 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.019238 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4blsk\" (UniqueName: \"kubernetes.io/projected/087bee3f-a34f-43ca-ac4b-b3e46e068898-kube-api-access-4blsk\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.019251 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4hqr\" (UniqueName: \"kubernetes.io/projected/aca1aa1b-b52a-447a-aa0f-771345a441c4-kube-api-access-p4hqr\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.019259 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.019266 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087bee3f-a34f-43ca-ac4b-b3e46e068898-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.019274 4727 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aca1aa1b-b52a-447a-aa0f-771345a441c4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.203992 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-dbh85"] Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.212256 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-dbh85"] Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.382571 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db62b86-1237-457e-91fb-3fcee6871537" path="/var/lib/kubelet/pods/7db62b86-1237-457e-91fb-3fcee6871537/volumes" Oct 01 12:53:34 crc kubenswrapper[4727]: I1001 12:53:34.383333 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca1aa1b-b52a-447a-aa0f-771345a441c4" path="/var/lib/kubelet/pods/aca1aa1b-b52a-447a-aa0f-771345a441c4/volumes" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.116489 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7485b6955d-qw7r5"] Oct 01 12:53:35 crc kubenswrapper[4727]: E1001 12:53:35.117189 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca1aa1b-b52a-447a-aa0f-771345a441c4" containerName="dnsmasq-dns" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.117218 4727 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="aca1aa1b-b52a-447a-aa0f-771345a441c4" containerName="dnsmasq-dns" Oct 01 12:53:35 crc kubenswrapper[4727]: E1001 12:53:35.117246 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca1aa1b-b52a-447a-aa0f-771345a441c4" containerName="init" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.117256 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca1aa1b-b52a-447a-aa0f-771345a441c4" containerName="init" Oct 01 12:53:35 crc kubenswrapper[4727]: E1001 12:53:35.117267 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="087bee3f-a34f-43ca-ac4b-b3e46e068898" containerName="barbican-db-sync" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.117274 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="087bee3f-a34f-43ca-ac4b-b3e46e068898" containerName="barbican-db-sync" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.117548 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca1aa1b-b52a-447a-aa0f-771345a441c4" containerName="dnsmasq-dns" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.117581 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="087bee3f-a34f-43ca-ac4b-b3e46e068898" containerName="barbican-db-sync" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.118938 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.129467 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.129859 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8l9cb" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.130070 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.133607 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7485b6955d-qw7r5"] Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.169456 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7d4477b597-fvt5b"] Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.172111 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.175910 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.225644 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d4477b597-fvt5b"] Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.248333 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e20fde-b925-4ef7-b5f8-4b6a50544990-logs\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.248395 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14e20fde-b925-4ef7-b5f8-4b6a50544990-config-data-custom\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.248432 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e20fde-b925-4ef7-b5f8-4b6a50544990-config-data\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.248573 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e20fde-b925-4ef7-b5f8-4b6a50544990-combined-ca-bundle\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.248636 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4czbt\" (UniqueName: \"kubernetes.io/projected/14e20fde-b925-4ef7-b5f8-4b6a50544990-kube-api-access-4czbt\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.265131 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-68dh7"] Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.267403 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.308750 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-68dh7"] Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.355642 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-logs\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.355717 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e20fde-b925-4ef7-b5f8-4b6a50544990-combined-ca-bundle\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.355763 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.355797 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4czbt\" (UniqueName: \"kubernetes.io/projected/14e20fde-b925-4ef7-b5f8-4b6a50544990-kube-api-access-4czbt\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.355828 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-config\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.355875 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrcsr\" (UniqueName: \"kubernetes.io/projected/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-kube-api-access-zrcsr\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.355910 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-config-data-custom\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.355952 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 
12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.355985 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.356028 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e20fde-b925-4ef7-b5f8-4b6a50544990-logs\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.356062 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14e20fde-b925-4ef7-b5f8-4b6a50544990-config-data-custom\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.356097 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e20fde-b925-4ef7-b5f8-4b6a50544990-config-data\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.356172 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g948\" (UniqueName: \"kubernetes.io/projected/17dd087d-9162-4d17-84fb-49bbcb2c542e-kube-api-access-4g948\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.356220 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.356255 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-combined-ca-bundle\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.356297 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-config-data\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.360768 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14e20fde-b925-4ef7-b5f8-4b6a50544990-logs\") pod 
\"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.369732 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e20fde-b925-4ef7-b5f8-4b6a50544990-config-data\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.371149 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e20fde-b925-4ef7-b5f8-4b6a50544990-combined-ca-bundle\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.389873 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14e20fde-b925-4ef7-b5f8-4b6a50544990-config-data-custom\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.397503 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4czbt\" (UniqueName: \"kubernetes.io/projected/14e20fde-b925-4ef7-b5f8-4b6a50544990-kube-api-access-4czbt\") pod \"barbican-keystone-listener-7485b6955d-qw7r5\" (UID: \"14e20fde-b925-4ef7-b5f8-4b6a50544990\") " pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.454206 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c46b9b5c6-gmpgq"] Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.456188 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.459482 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-config-data\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.459737 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-logs\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.459904 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.460028 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-config\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.460332 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrcsr\" (UniqueName: \"kubernetes.io/projected/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-kube-api-access-zrcsr\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.464623 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.464562 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.464715 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-config-data-custom\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.464720 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-logs\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.465146 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-config\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.465961 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.465336 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.466310 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.468083 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.468329 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g948\" (UniqueName: \"kubernetes.io/projected/17dd087d-9162-4d17-84fb-49bbcb2c542e-kube-api-access-4g948\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.469622 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.469692 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-combined-ca-bundle\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.470838 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.471186 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-config-data\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.471251 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.472938 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-config-data-custom\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.473646 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-combined-ca-bundle\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.477143 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c46b9b5c6-gmpgq"] Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.491966 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrcsr\" (UniqueName: \"kubernetes.io/projected/0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9-kube-api-access-zrcsr\") pod \"barbican-worker-7d4477b597-fvt5b\" (UID: \"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9\") " pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.493257 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g948\" (UniqueName: \"kubernetes.io/projected/17dd087d-9162-4d17-84fb-49bbcb2c542e-kube-api-access-4g948\") pod \"dnsmasq-dns-59d5ff467f-68dh7\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.493648 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7d4477b597-fvt5b" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.572958 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-config-data-custom\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.573100 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-config-data\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.573155 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/122ed690-1a2a-4989-ae98-c9009df8bb95-logs\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.573607 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntg8m\" (UniqueName: \"kubernetes.io/projected/122ed690-1a2a-4989-ae98-c9009df8bb95-kube-api-access-ntg8m\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.573820 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-combined-ca-bundle\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.600383 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.676507 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-combined-ca-bundle\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.677125 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-config-data-custom\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.677181 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-config-data\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.677202 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/122ed690-1a2a-4989-ae98-c9009df8bb95-logs\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.677243 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntg8m\" (UniqueName: \"kubernetes.io/projected/122ed690-1a2a-4989-ae98-c9009df8bb95-kube-api-access-ntg8m\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.681775 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/122ed690-1a2a-4989-ae98-c9009df8bb95-logs\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.684214 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-combined-ca-bundle\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.685894 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-config-data\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.697612 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntg8m\" (UniqueName: \"kubernetes.io/projected/122ed690-1a2a-4989-ae98-c9009df8bb95-kube-api-access-ntg8m\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 
12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.702507 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-config-data-custom\") pod \"barbican-api-7c46b9b5c6-gmpgq\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:35 crc kubenswrapper[4727]: I1001 12:53:35.857249 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:37 crc kubenswrapper[4727]: E1001 12:53:37.236270 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core:latest" Oct 01 12:53:37 crc kubenswrapper[4727]: E1001 12:53:37.238027 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kxhcc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(25c1560a-bd40-490a-8d86-a71b9a34b7ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.265895 4727 scope.go:117] "RemoveContainer" containerID="af08112beb9a78d13612f378533eb4aa62d825792604cb0de7900927201b3d10" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.449348 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.459314 4727 scope.go:117] "RemoveContainer" containerID="0eef0ba817e3e4b9f43b2c2ee52f9911b07421103e5fb1db0adcc46cc283374c" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.621041 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-config-data\") pod \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.621079 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.621110 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-scripts\") pod \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.621148 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m2jq\" (UniqueName: \"kubernetes.io/projected/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-kube-api-access-2m2jq\") pod \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.621211 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-logs\") pod \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.621324 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-combined-ca-bundle\") pod \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.621361 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-httpd-run\") pod \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\" (UID: \"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e\") " Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.622017 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-logs" (OuterVolumeSpecName: "logs") pod "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" (UID: "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.622034 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" (UID: "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.629389 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-kube-api-access-2m2jq" (OuterVolumeSpecName: "kube-api-access-2m2jq") pod "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" (UID: "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e"). InnerVolumeSpecName "kube-api-access-2m2jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.631366 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" (UID: "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.635347 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-scripts" (OuterVolumeSpecName: "scripts") pod "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" (UID: "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.723039 4727 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.723092 4727 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.723106 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.723119 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m2jq\" (UniqueName: \"kubernetes.io/projected/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-kube-api-access-2m2jq\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.723132 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.729821 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-config-data" (OuterVolumeSpecName: "config-data") pod "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" (UID: "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.731367 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" (UID: "7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.748867 4727 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.808877 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76fcf4b695-dbh85" podUID="aca1aa1b-b52a-447a-aa0f-771345a441c4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.137:5353: i/o timeout" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.825984 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.826092 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.826105 4727 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.831455 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6977f5dffd-4k4rv"] Oct 01 12:53:37 crc kubenswrapper[4727]: E1001 12:53:37.832019 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" containerName="glance-log" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.832043 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" containerName="glance-log" Oct 01 12:53:37 crc kubenswrapper[4727]: E1001 12:53:37.832067 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" containerName="glance-httpd" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.832077 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" containerName="glance-httpd" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.832331 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" containerName="glance-httpd" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.832384 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" containerName="glance-log" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.833384 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.836487 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.836692 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.853940 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6977f5dffd-4k4rv"] Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.874832 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-648455799b-c8jzs"] Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.890468 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-648455799b-c8jzs" event={"ID":"87998a25-3079-49a0-93da-d4326ed0ccc3","Type":"ContainerStarted","Data":"95791d5a234003dfb392a119a145d38c36fff3d7878231d4782b012ba078c4bd"} Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.896796 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.896816 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e","Type":"ContainerDied","Data":"8866cb0ac21cd6bdd24abc2a0375325291a97f72936ef8e7164a872b50f265c4"} Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.896940 4727 scope.go:117] "RemoveContainer" containerID="341898e00a7ccaef92cf5f0322d69b7a89d64664b60d5a51e2b552b7da137d95" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.928673 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-public-tls-certs\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.928735 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-config-data-custom\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.928774 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qr2w\" (UniqueName: \"kubernetes.io/projected/81bb6eec-986d-4589-b007-dc88fcf1832b-kube-api-access-7qr2w\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.928813 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-internal-tls-certs\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.928975 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81bb6eec-986d-4589-b007-dc88fcf1832b-logs\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.929050 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-config-data\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.929081 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-combined-ca-bundle\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.951052 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.959141 4727 scope.go:117] "RemoveContainer" containerID="3f430d4178c1ed8bfa05b43cef74db794943f13485e6d9aa5fb7c8a6e71f1e9b" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.966755 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.976284 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.977925 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.986685 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.988454 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 12:53:37 crc kubenswrapper[4727]: I1001 12:53:37.993195 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.009458 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-68dh7"] Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.030539 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-internal-tls-certs\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.030615 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81bb6eec-986d-4589-b007-dc88fcf1832b-logs\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.030646 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-config-data\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.030666 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-combined-ca-bundle\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.030734 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-public-tls-certs\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.030762 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-config-data-custom\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.030785 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qr2w\" (UniqueName: \"kubernetes.io/projected/81bb6eec-986d-4589-b007-dc88fcf1832b-kube-api-access-7qr2w\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 
12:53:38.031507 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81bb6eec-986d-4589-b007-dc88fcf1832b-logs\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.034911 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-internal-tls-certs\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.040916 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-config-data-custom\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.040985 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-config-data\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.041659 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-combined-ca-bundle\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.041671 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/81bb6eec-986d-4589-b007-dc88fcf1832b-public-tls-certs\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.051584 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qr2w\" (UniqueName: \"kubernetes.io/projected/81bb6eec-986d-4589-b007-dc88fcf1832b-kube-api-access-7qr2w\") pod \"barbican-api-6977f5dffd-4k4rv\" (UID: \"81bb6eec-986d-4589-b007-dc88fcf1832b\") " pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.132115 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.132188 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.132211 4727 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwtbn\" (UniqueName: \"kubernetes.io/projected/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-kube-api-access-xwtbn\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.132258 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.132281 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.132309 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.132562 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.132643 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.164406 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.232755 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7485b6955d-qw7r5"] Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.233839 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.233926 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.233986 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.234080 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwtbn\" (UniqueName: \"kubernetes.io/projected/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-kube-api-access-xwtbn\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.234126 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.234148 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.234177 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.234229 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.238155 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.238495 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.238512 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.239267 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.240100 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.241165 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d4477b597-fvt5b"] Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.242560 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.244313 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.251644 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c46b9b5c6-gmpgq"] Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.258200 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwtbn\" (UniqueName: \"kubernetes.io/projected/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-kube-api-access-xwtbn\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: W1001 12:53:38.262324 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14e20fde_b925_4ef7_b5f8_4b6a50544990.slice/crio-0254b7cd5b8c12e88530b550bb86ad8f31d19245f88a011058122b0c8a19411e WatchSource:0}: 
Error finding container 0254b7cd5b8c12e88530b550bb86ad8f31d19245f88a011058122b0c8a19411e: Status 404 returned error can't find the container with id 0254b7cd5b8c12e88530b550bb86ad8f31d19245f88a011058122b0c8a19411e Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.279170 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.321602 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.365536 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.386939 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e" path="/var/lib/kubelet/pods/7e2e2a1c-0e9f-4ad7-9e0a-e1d46682d31e/volumes" Oct 01 12:53:38 crc kubenswrapper[4727]: W1001 12:53:38.415175 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6468c22_4086_4884_a045_9f77a1b459d6.slice/crio-286e311f987d344ba7b01214f5cdde355f74bb75d799de3cc2f70b4e2dcb8ef4 WatchSource:0}: Error finding container 286e311f987d344ba7b01214f5cdde355f74bb75d799de3cc2f70b4e2dcb8ef4: Status 404 returned error can't find the container with id 286e311f987d344ba7b01214f5cdde355f74bb75d799de3cc2f70b4e2dcb8ef4 Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.668829 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6977f5dffd-4k4rv"] Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.955131 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.962229 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6468c22-4086-4884-a045-9f77a1b459d6","Type":"ContainerStarted","Data":"286e311f987d344ba7b01214f5cdde355f74bb75d799de3cc2f70b4e2dcb8ef4"} Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.970051 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d4477b597-fvt5b" event={"ID":"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9","Type":"ContainerStarted","Data":"4b9a8f9f33a4cd65be8f2eb8412d128e071caac91712560000c5a661dbc335ab"} Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.993211 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-648455799b-c8jzs" event={"ID":"87998a25-3079-49a0-93da-d4326ed0ccc3","Type":"ContainerStarted","Data":"0f76723da968bea943343df65bc678741b6f6246c8445785a15609d6275aa9f5"} Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.993294 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-648455799b-c8jzs" event={"ID":"87998a25-3079-49a0-93da-d4326ed0ccc3","Type":"ContainerStarted","Data":"e2d13e96e46c0685595e0cad1aec298638e547e6c44d01e363b4df462a08216f"} Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.994233 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:38 crc kubenswrapper[4727]: I1001 12:53:38.994386 4727 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-648455799b-c8jzs" Oct 01 12:53:39 crc kubenswrapper[4727]: I1001 12:53:39.001446 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" event={"ID":"122ed690-1a2a-4989-ae98-c9009df8bb95","Type":"ContainerStarted","Data":"bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7"} Oct 01 12:53:39 crc kubenswrapper[4727]: I1001 12:53:39.001511 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" event={"ID":"122ed690-1a2a-4989-ae98-c9009df8bb95","Type":"ContainerStarted","Data":"1f831069dc336e2953768113f696050f9d1eb0cc80fd6c5e503abd3bcd9e0f8b"} Oct 01 12:53:39 crc kubenswrapper[4727]: I1001 12:53:39.008564 4727 generic.go:334] "Generic (PLEG): container finished" podID="17dd087d-9162-4d17-84fb-49bbcb2c542e" containerID="654162fd7d0169b4404daeedafcbaf7ed4dc9a4e2f955bb2228cbf20cfd54fa9" exitCode=0 Oct 01 12:53:39 crc kubenswrapper[4727]: I1001 12:53:39.008733 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" event={"ID":"17dd087d-9162-4d17-84fb-49bbcb2c542e","Type":"ContainerDied","Data":"654162fd7d0169b4404daeedafcbaf7ed4dc9a4e2f955bb2228cbf20cfd54fa9"} Oct 01 12:53:39 crc kubenswrapper[4727]: I1001 12:53:39.008772 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" event={"ID":"17dd087d-9162-4d17-84fb-49bbcb2c542e","Type":"ContainerStarted","Data":"1e72a2a03f34ebf7fc27501c99ee3e188fc1668af4a9adfe8d5ecf4f5c5510fc"} Oct 01 12:53:39 crc kubenswrapper[4727]: I1001 12:53:39.013522 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6977f5dffd-4k4rv" event={"ID":"81bb6eec-986d-4589-b007-dc88fcf1832b","Type":"ContainerStarted","Data":"6ecccc058e4e6e1121c34cb30e661612be735ce959e56007727372e100c41f06"} Oct 01 12:53:39 crc kubenswrapper[4727]: I1001 12:53:39.017131 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" event={"ID":"14e20fde-b925-4ef7-b5f8-4b6a50544990","Type":"ContainerStarted","Data":"0254b7cd5b8c12e88530b550bb86ad8f31d19245f88a011058122b0c8a19411e"} Oct 01 12:53:39 crc kubenswrapper[4727]: I1001 12:53:39.027788 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-648455799b-c8jzs" podStartSLOduration=6.027759684 podStartE2EDuration="6.027759684s" podCreationTimestamp="2025-10-01 12:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:39.015642336 +0000 UTC m=+997.336997183" watchObservedRunningTime="2025-10-01 12:53:39.027759684 +0000 UTC m=+997.349114531" Oct 01 12:53:40 crc kubenswrapper[4727]: I1001 12:53:40.059121 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6977f5dffd-4k4rv" event={"ID":"81bb6eec-986d-4589-b007-dc88fcf1832b","Type":"ContainerStarted","Data":"20122ecf6ee989ca388a95e43500ac5b5e7053c08d778881db952948edbe5b9c"} Oct 01 12:53:40 crc kubenswrapper[4727]: I1001 12:53:40.059934 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6977f5dffd-4k4rv" event={"ID":"81bb6eec-986d-4589-b007-dc88fcf1832b","Type":"ContainerStarted","Data":"2bcbdcfccde9fed1d06599016f8ad2dfa85377b138124e254f43ef890f99fb49"} Oct 01 12:53:40 crc kubenswrapper[4727]: I1001 12:53:40.090905 4727 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6468c22-4086-4884-a045-9f77a1b459d6","Type":"ContainerStarted","Data":"b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1"} Oct 01 12:53:40 crc kubenswrapper[4727]: I1001 12:53:40.100041 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff4391de-b5d6-4014-961f-a00f0a8ec3c6","Type":"ContainerStarted","Data":"1315cd2f0f3b85753d1dec76af163179617a09853538f37b14083755a165dd3b"} Oct 01 12:53:40 crc kubenswrapper[4727]: I1001 12:53:40.110800 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" event={"ID":"122ed690-1a2a-4989-ae98-c9009df8bb95","Type":"ContainerStarted","Data":"b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d"} Oct 01 12:53:40 crc kubenswrapper[4727]: I1001 12:53:40.111153 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:40 crc kubenswrapper[4727]: I1001 12:53:40.111270 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:40 crc kubenswrapper[4727]: I1001 12:53:40.129222 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" event={"ID":"17dd087d-9162-4d17-84fb-49bbcb2c542e","Type":"ContainerStarted","Data":"0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05"} Oct 01 12:53:40 crc kubenswrapper[4727]: I1001 12:53:40.129290 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:40 crc kubenswrapper[4727]: I1001 12:53:40.153130 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" podStartSLOduration=5.153099442 podStartE2EDuration="5.153099442s" podCreationTimestamp="2025-10-01 12:53:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:40.139491366 +0000 UTC m=+998.460846213" watchObservedRunningTime="2025-10-01 12:53:40.153099442 +0000 UTC m=+998.474454269" Oct 01 12:53:40 crc kubenswrapper[4727]: I1001 12:53:40.182982 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" podStartSLOduration=5.182946273 podStartE2EDuration="5.182946273s" podCreationTimestamp="2025-10-01 12:53:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:40.179714693 +0000 UTC m=+998.501069550" watchObservedRunningTime="2025-10-01 12:53:40.182946273 +0000 UTC m=+998.504301110" Oct 01 12:53:41 crc kubenswrapper[4727]: I1001 12:53:41.161386 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6977f5dffd-4k4rv" podStartSLOduration=4.16136967 podStartE2EDuration="4.16136967s" podCreationTimestamp="2025-10-01 12:53:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:41.152514373 +0000 UTC m=+999.473869240" watchObservedRunningTime="2025-10-01 12:53:41.16136967 +0000 UTC m=+999.482724507" Oct 01 12:53:42 crc kubenswrapper[4727]: I1001 12:53:42.154442 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"a6468c22-4086-4884-a045-9f77a1b459d6","Type":"ContainerStarted","Data":"d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc"} Oct 01 12:53:42 crc kubenswrapper[4727]: I1001 12:53:42.162264 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff4391de-b5d6-4014-961f-a00f0a8ec3c6","Type":"ContainerStarted","Data":"558b23bc2226bc21814dd57165cf8033e45ef38cf572e228bb1d7acd22b6c525"} Oct 01 12:53:42 crc kubenswrapper[4727]: I1001 12:53:42.398579 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.39856408 podStartE2EDuration="10.39856408s" podCreationTimestamp="2025-10-01 12:53:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:42.180811038 +0000 UTC m=+1000.502165875" watchObservedRunningTime="2025-10-01 12:53:42.39856408 +0000 UTC m=+1000.719918917" Oct 01 12:53:43 crc kubenswrapper[4727]: I1001 12:53:43.164954 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:43 crc kubenswrapper[4727]: I1001 12:53:43.165508 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:43 crc kubenswrapper[4727]: I1001 12:53:43.173480 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff4391de-b5d6-4014-961f-a00f0a8ec3c6","Type":"ContainerStarted","Data":"61f027a0eafca855e2f7f638c146097f7d31adbceb27f575cd21de41b6ca23fa"} Oct 01 12:53:43 crc kubenswrapper[4727]: I1001 12:53:43.204363 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.204346893 podStartE2EDuration="6.204346893s" podCreationTimestamp="2025-10-01 12:53:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:53:43.194470984 +0000 UTC m=+1001.515825821" watchObservedRunningTime="2025-10-01 12:53:43.204346893 +0000 UTC m=+1001.525701730" Oct 01 12:53:43 crc kubenswrapper[4727]: I1001 12:53:43.531431 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 12:53:43 crc kubenswrapper[4727]: I1001 12:53:43.531667 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 12:53:43 crc kubenswrapper[4727]: I1001 12:53:43.559365 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 12:53:43 crc kubenswrapper[4727]: I1001 12:53:43.585308 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 12:53:44 crc kubenswrapper[4727]: I1001 12:53:44.207017 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 12:53:44 crc kubenswrapper[4727]: I1001 12:53:44.207557 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 12:53:45 crc kubenswrapper[4727]: I1001 12:53:45.602979 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:53:45 crc kubenswrapper[4727]: I1001 12:53:45.682431 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-4jv4q"] Oct 01 12:53:45 crc kubenswrapper[4727]: I1001 12:53:45.682713 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" podUID="66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" containerName="dnsmasq-dns" containerID="cri-o://86df7a77f4c956945ec38fb19ee67590cd65a17a7009d89bde5b6c75861e3052" gracePeriod=10 Oct 01 12:53:46 crc kubenswrapper[4727]: I1001 12:53:46.229988 4727 generic.go:334] "Generic (PLEG): container finished" podID="66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" containerID="86df7a77f4c956945ec38fb19ee67590cd65a17a7009d89bde5b6c75861e3052" exitCode=0 Oct 01 12:53:46 crc kubenswrapper[4727]: I1001 12:53:46.230048 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" event={"ID":"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd","Type":"ContainerDied","Data":"86df7a77f4c956945ec38fb19ee67590cd65a17a7009d89bde5b6c75861e3052"} Oct 01 12:53:46 crc kubenswrapper[4727]: I1001 12:53:46.307357 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 12:53:48 crc kubenswrapper[4727]: I1001 12:53:48.077379 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:48 crc kubenswrapper[4727]: I1001 12:53:48.323587 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:48 crc kubenswrapper[4727]: I1001 12:53:48.323633 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:48 crc kubenswrapper[4727]: I1001 12:53:48.327673 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:48 crc kubenswrapper[4727]: I1001 12:53:48.419576 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:48 crc kubenswrapper[4727]: I1001 12:53:48.421583 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:48 crc kubenswrapper[4727]: I1001 12:53:48.996109 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 12:53:49 crc kubenswrapper[4727]: I1001 12:53:49.267014 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:49 crc kubenswrapper[4727]: I1001 12:53:49.267060 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:49 crc kubenswrapper[4727]: I1001 12:53:49.842413 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:49 crc kubenswrapper[4727]: I1001 12:53:49.930939 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-ovsdbserver-nb\") pod \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " Oct 01 12:53:49 crc kubenswrapper[4727]: I1001 12:53:49.931335 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-dns-svc\") pod \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " Oct 01 12:53:49 crc kubenswrapper[4727]: I1001 12:53:49.931383 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-dns-swift-storage-0\") pod \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " Oct 01 12:53:49 crc kubenswrapper[4727]: I1001 12:53:49.931458 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5766l\" (UniqueName: \"kubernetes.io/projected/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-kube-api-access-5766l\") pod \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " Oct 01 12:53:49 crc kubenswrapper[4727]: I1001 12:53:49.931475 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-config\") pod \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " Oct 01 12:53:49 crc kubenswrapper[4727]: I1001 12:53:49.931520 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-ovsdbserver-sb\") pod \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\" (UID: \"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd\") " Oct 01 12:53:49 crc kubenswrapper[4727]: I1001 12:53:49.978499 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-kube-api-access-5766l" (OuterVolumeSpecName: "kube-api-access-5766l") pod "66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" (UID: "66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd"). InnerVolumeSpecName "kube-api-access-5766l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.033475 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5766l\" (UniqueName: \"kubernetes.io/projected/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-kube-api-access-5766l\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.178051 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" (UID: "66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.187471 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.238439 4727 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.270547 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" (UID: "66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.284819 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" (UID: "66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.292424 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" (UID: "66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.298192 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-config" (OuterVolumeSpecName: "config") pod "66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" (UID: "66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.309426 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" event={"ID":"66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd","Type":"ContainerDied","Data":"87195537dc5d93a3a59c7d63569e85d8a23fef1fbace7c73679ed54f519f1aaa"} Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.309485 4727 scope.go:117] "RemoveContainer" containerID="86df7a77f4c956945ec38fb19ee67590cd65a17a7009d89bde5b6c75861e3052" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.309638 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.321905 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d4477b597-fvt5b" event={"ID":"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9","Type":"ContainerStarted","Data":"2e326b33e2825065f42faf2743847f1922db7da8499e422ebdca63067f025df7"} Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.340202 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.340243 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.340257 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.340271 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.385988 4727 scope.go:117] "RemoveContainer" containerID="9d79c86d400af50f8cb425e8d0b71ec2bd4d00a0215e5caa9455a201f00afcff" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.407801 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-4jv4q"] Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.407830 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-4jv4q"] Oct 01 12:53:50 crc kubenswrapper[4727]: E1001 12:53:50.474203 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.642127 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6977f5dffd-4k4rv" Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.704479 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c46b9b5c6-gmpgq"] Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.704781 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" podUID="122ed690-1a2a-4989-ae98-c9009df8bb95" containerName="barbican-api-log" containerID="cri-o://bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7" gracePeriod=30 Oct 01 12:53:50 crc kubenswrapper[4727]: I1001 12:53:50.704942 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" podUID="122ed690-1a2a-4989-ae98-c9009df8bb95" containerName="barbican-api" containerID="cri-o://b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d" gracePeriod=30 Oct 01 12:53:51 crc kubenswrapper[4727]: I1001 12:53:51.392636 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" 
event={"ID":"14e20fde-b925-4ef7-b5f8-4b6a50544990","Type":"ContainerStarted","Data":"44afa5e94234d281ae993496357de7c9d553c126b817433237b55e5ded486ffc"} Oct 01 12:53:51 crc kubenswrapper[4727]: I1001 12:53:51.392957 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" event={"ID":"14e20fde-b925-4ef7-b5f8-4b6a50544990","Type":"ContainerStarted","Data":"fb70d4c8ae775594ce17f1eea6470bdc1abd623252ad5a9e5e9a14c81c38bac3"} Oct 01 12:53:51 crc kubenswrapper[4727]: I1001 12:53:51.407254 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d4477b597-fvt5b" event={"ID":"0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9","Type":"ContainerStarted","Data":"11226e61ad26264a4f010626753655f465e68877876d22deb462db7a13272597"} Oct 01 12:53:51 crc kubenswrapper[4727]: I1001 12:53:51.426453 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7485b6955d-qw7r5" podStartSLOduration=5.134160259 podStartE2EDuration="16.426432667s" podCreationTimestamp="2025-10-01 12:53:35 +0000 UTC" firstStartedPulling="2025-10-01 12:53:38.265411198 +0000 UTC m=+996.586766035" lastFinishedPulling="2025-10-01 12:53:49.557683606 +0000 UTC m=+1007.879038443" observedRunningTime="2025-10-01 12:53:51.417939092 +0000 UTC m=+1009.739293939" watchObservedRunningTime="2025-10-01 12:53:51.426432667 +0000 UTC m=+1009.747787514" Oct 01 12:53:51 crc kubenswrapper[4727]: I1001 12:53:51.442389 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25c1560a-bd40-490a-8d86-a71b9a34b7ea","Type":"ContainerStarted","Data":"c3d979a0586388baf9f3d1bb9a65762f8594c091bda209d5232182a6888e5004"} Oct 01 12:53:51 crc kubenswrapper[4727]: I1001 12:53:51.442784 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerName="ceilometer-central-agent" containerID="cri-o://28c31b69b1f606cc17277305288fa4f7b5f954268c25fa4fb1df23484b1f746a" gracePeriod=30 Oct 01 12:53:51 crc kubenswrapper[4727]: I1001 12:53:51.442986 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:53:51 crc kubenswrapper[4727]: I1001 12:53:51.443206 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerName="proxy-httpd" containerID="cri-o://c3d979a0586388baf9f3d1bb9a65762f8594c091bda209d5232182a6888e5004" gracePeriod=30 Oct 01 12:53:51 crc kubenswrapper[4727]: I1001 12:53:51.443359 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerName="ceilometer-notification-agent" containerID="cri-o://8155aac14dfd4155e42e2bf5e444b03a3c02e30d5e3e12e85b5999ab62237a25" gracePeriod=30 Oct 01 12:53:51 crc kubenswrapper[4727]: I1001 12:53:51.454351 4727 generic.go:334] "Generic (PLEG): container finished" podID="122ed690-1a2a-4989-ae98-c9009df8bb95" containerID="bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7" exitCode=143 Oct 01 12:53:51 crc kubenswrapper[4727]: I1001 12:53:51.454398 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" event={"ID":"122ed690-1a2a-4989-ae98-c9009df8bb95","Type":"ContainerDied","Data":"bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7"} Oct 01 12:53:51 crc 
kubenswrapper[4727]: I1001 12:53:51.454606 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7d4477b597-fvt5b" podStartSLOduration=5.151683597 podStartE2EDuration="16.454595337s" podCreationTimestamp="2025-10-01 12:53:35 +0000 UTC" firstStartedPulling="2025-10-01 12:53:38.25523695 +0000 UTC m=+996.576591787" lastFinishedPulling="2025-10-01 12:53:49.55814869 +0000 UTC m=+1007.879503527" observedRunningTime="2025-10-01 12:53:51.443223572 +0000 UTC m=+1009.764578409" watchObservedRunningTime="2025-10-01 12:53:51.454595337 +0000 UTC m=+1009.775950174" Oct 01 12:53:52 crc kubenswrapper[4727]: I1001 12:53:52.326316 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 12:53:52 crc kubenswrapper[4727]: I1001 12:53:52.326683 4727 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:53:52 crc kubenswrapper[4727]: I1001 12:53:52.385440 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" path="/var/lib/kubelet/pods/66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd/volumes" Oct 01 12:53:52 crc kubenswrapper[4727]: I1001 12:53:52.464789 4727 generic.go:334] "Generic (PLEG): container finished" podID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerID="c3d979a0586388baf9f3d1bb9a65762f8594c091bda209d5232182a6888e5004" exitCode=0 Oct 01 12:53:52 crc kubenswrapper[4727]: I1001 12:53:52.464820 4727 generic.go:334] "Generic (PLEG): container finished" podID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerID="28c31b69b1f606cc17277305288fa4f7b5f954268c25fa4fb1df23484b1f746a" exitCode=0 Oct 01 12:53:52 crc kubenswrapper[4727]: I1001 12:53:52.464836 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25c1560a-bd40-490a-8d86-a71b9a34b7ea","Type":"ContainerDied","Data":"c3d979a0586388baf9f3d1bb9a65762f8594c091bda209d5232182a6888e5004"} Oct 01 12:53:52 crc kubenswrapper[4727]: I1001 12:53:52.464878 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25c1560a-bd40-490a-8d86-a71b9a34b7ea","Type":"ContainerDied","Data":"28c31b69b1f606cc17277305288fa4f7b5f954268c25fa4fb1df23484b1f746a"} Oct 01 12:53:52 crc kubenswrapper[4727]: I1001 12:53:52.467122 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-74wc5" event={"ID":"5746629a-ce5e-4404-8996-165034633b9e","Type":"ContainerStarted","Data":"031f6669cf8749f3fcf4677257b33ab960904b0aea2446f27aae497cffcf9f2b"} Oct 01 12:53:52 crc kubenswrapper[4727]: I1001 12:53:52.483058 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7f4d7ff84c-l27rl" Oct 01 12:53:52 crc kubenswrapper[4727]: I1001 12:53:52.496502 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-74wc5" podStartSLOduration=3.751705413 podStartE2EDuration="44.496487067s" podCreationTimestamp="2025-10-01 12:53:08 +0000 UTC" firstStartedPulling="2025-10-01 12:53:09.301352391 +0000 UTC m=+967.622707228" lastFinishedPulling="2025-10-01 12:53:50.046134055 +0000 UTC m=+1008.367488882" observedRunningTime="2025-10-01 12:53:52.491486761 +0000 UTC m=+1010.812841598" watchObservedRunningTime="2025-10-01 12:53:52.496487067 +0000 UTC m=+1010.817841904" Oct 01 12:53:52 crc kubenswrapper[4727]: I1001 12:53:52.946512 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Oct 01 12:53:53 crc kubenswrapper[4727]: I1001 12:53:53.891625 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" podUID="122ed690-1a2a-4989-ae98-c9009df8bb95" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": read tcp 10.217.0.2:57116->10.217.0.152:9311: read: connection reset by peer" Oct 01 12:53:53 crc kubenswrapper[4727]: I1001 12:53:53.891754 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" podUID="122ed690-1a2a-4989-ae98-c9009df8bb95" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.152:9311/healthcheck\": read tcp 10.217.0.2:57120->10.217.0.152:9311: read: connection reset by peer" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.029492 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8b5c85b87-4jv4q" podUID="66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.355384 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.486234 4727 generic.go:334] "Generic (PLEG): container finished" podID="122ed690-1a2a-4989-ae98-c9009df8bb95" containerID="b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d" exitCode=0 Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.486287 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" event={"ID":"122ed690-1a2a-4989-ae98-c9009df8bb95","Type":"ContainerDied","Data":"b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d"} Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.486317 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" event={"ID":"122ed690-1a2a-4989-ae98-c9009df8bb95","Type":"ContainerDied","Data":"1f831069dc336e2953768113f696050f9d1eb0cc80fd6c5e503abd3bcd9e0f8b"} Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.486338 4727 scope.go:117] "RemoveContainer" containerID="b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.486469 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c46b9b5c6-gmpgq" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.508134 4727 scope.go:117] "RemoveContainer" containerID="bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.520938 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntg8m\" (UniqueName: \"kubernetes.io/projected/122ed690-1a2a-4989-ae98-c9009df8bb95-kube-api-access-ntg8m\") pod \"122ed690-1a2a-4989-ae98-c9009df8bb95\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.521022 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/122ed690-1a2a-4989-ae98-c9009df8bb95-logs\") pod \"122ed690-1a2a-4989-ae98-c9009df8bb95\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.521189 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-config-data-custom\") pod \"122ed690-1a2a-4989-ae98-c9009df8bb95\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.521224 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-config-data\") pod \"122ed690-1a2a-4989-ae98-c9009df8bb95\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.521261 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-combined-ca-bundle\") pod \"122ed690-1a2a-4989-ae98-c9009df8bb95\" (UID: \"122ed690-1a2a-4989-ae98-c9009df8bb95\") " Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.521557 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122ed690-1a2a-4989-ae98-c9009df8bb95-logs" (OuterVolumeSpecName: "logs") pod "122ed690-1a2a-4989-ae98-c9009df8bb95" (UID: "122ed690-1a2a-4989-ae98-c9009df8bb95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.522247 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/122ed690-1a2a-4989-ae98-c9009df8bb95-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.529787 4727 scope.go:117] "RemoveContainer" containerID="b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.530235 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122ed690-1a2a-4989-ae98-c9009df8bb95-kube-api-access-ntg8m" (OuterVolumeSpecName: "kube-api-access-ntg8m") pod "122ed690-1a2a-4989-ae98-c9009df8bb95" (UID: "122ed690-1a2a-4989-ae98-c9009df8bb95"). InnerVolumeSpecName "kube-api-access-ntg8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.531234 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "122ed690-1a2a-4989-ae98-c9009df8bb95" (UID: "122ed690-1a2a-4989-ae98-c9009df8bb95"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:54 crc kubenswrapper[4727]: E1001 12:53:54.531245 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d\": container with ID starting with b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d not found: ID does not exist" containerID="b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.531311 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d"} err="failed to get container status \"b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d\": rpc error: code = NotFound desc = could not find container \"b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d\": container with ID starting with b289791de95da7dbce6b9e1e53bd187a5645efd663793a05a2dd9a6cdb52951d not found: ID does not exist" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.531339 4727 scope.go:117] "RemoveContainer" containerID="bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7" Oct 01 12:53:54 crc kubenswrapper[4727]: E1001 12:53:54.531638 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7\": container with ID starting with bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7 not found: ID does not exist" containerID="bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.531674 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7"} err="failed to get container status \"bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7\": rpc error: code = NotFound desc = could not find container \"bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7\": container with ID starting with bd961045236b675ca8c307b9d082ebfbdc384d5d97089a4b4d0fd780334bfbc7 not found: ID does not exist" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.579803 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "122ed690-1a2a-4989-ae98-c9009df8bb95" (UID: "122ed690-1a2a-4989-ae98-c9009df8bb95"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.595793 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-config-data" (OuterVolumeSpecName: "config-data") pod "122ed690-1a2a-4989-ae98-c9009df8bb95" (UID: "122ed690-1a2a-4989-ae98-c9009df8bb95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.623680 4727 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.624374 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.624423 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122ed690-1a2a-4989-ae98-c9009df8bb95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.624439 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntg8m\" (UniqueName: \"kubernetes.io/projected/122ed690-1a2a-4989-ae98-c9009df8bb95-kube-api-access-ntg8m\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.817422 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c46b9b5c6-gmpgq"] Oct 01 12:53:54 crc kubenswrapper[4727]: I1001 12:53:54.825244 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7c46b9b5c6-gmpgq"] Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.505597 4727 generic.go:334] "Generic (PLEG): container finished" podID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerID="8155aac14dfd4155e42e2bf5e444b03a3c02e30d5e3e12e85b5999ab62237a25" exitCode=0 Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.505664 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25c1560a-bd40-490a-8d86-a71b9a34b7ea","Type":"ContainerDied","Data":"8155aac14dfd4155e42e2bf5e444b03a3c02e30d5e3e12e85b5999ab62237a25"} Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.509217 4727 generic.go:334] "Generic (PLEG): container finished" podID="d4da191c-6509-4bb7-b9b2-344f8224ae58" containerID="9df9fca93de18a593d6c01b4c56a4f0c1ab4ae6be3d3bb88e43ece750ad4d56c" exitCode=0 Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.509245 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nkrx8" event={"ID":"d4da191c-6509-4bb7-b9b2-344f8224ae58","Type":"ContainerDied","Data":"9df9fca93de18a593d6c01b4c56a4f0c1ab4ae6be3d3bb88e43ece750ad4d56c"} Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.621442 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.747549 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25c1560a-bd40-490a-8d86-a71b9a34b7ea-run-httpd\") pod \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.747609 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25c1560a-bd40-490a-8d86-a71b9a34b7ea-log-httpd\") pod \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.747663 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-sg-core-conf-yaml\") pod \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.747687 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-scripts\") pod \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.747742 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-config-data\") pod \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.747846 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-combined-ca-bundle\") pod \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.747875 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxhcc\" (UniqueName: \"kubernetes.io/projected/25c1560a-bd40-490a-8d86-a71b9a34b7ea-kube-api-access-kxhcc\") pod \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\" (UID: \"25c1560a-bd40-490a-8d86-a71b9a34b7ea\") " Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.748284 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25c1560a-bd40-490a-8d86-a71b9a34b7ea-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "25c1560a-bd40-490a-8d86-a71b9a34b7ea" (UID: "25c1560a-bd40-490a-8d86-a71b9a34b7ea"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.749230 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25c1560a-bd40-490a-8d86-a71b9a34b7ea-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "25c1560a-bd40-490a-8d86-a71b9a34b7ea" (UID: "25c1560a-bd40-490a-8d86-a71b9a34b7ea"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.756417 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "25c1560a-bd40-490a-8d86-a71b9a34b7ea" (UID: "25c1560a-bd40-490a-8d86-a71b9a34b7ea"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.756648 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25c1560a-bd40-490a-8d86-a71b9a34b7ea-kube-api-access-kxhcc" (OuterVolumeSpecName: "kube-api-access-kxhcc") pod "25c1560a-bd40-490a-8d86-a71b9a34b7ea" (UID: "25c1560a-bd40-490a-8d86-a71b9a34b7ea"). InnerVolumeSpecName "kube-api-access-kxhcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.760784 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-scripts" (OuterVolumeSpecName: "scripts") pod "25c1560a-bd40-490a-8d86-a71b9a34b7ea" (UID: "25c1560a-bd40-490a-8d86-a71b9a34b7ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.830228 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25c1560a-bd40-490a-8d86-a71b9a34b7ea" (UID: "25c1560a-bd40-490a-8d86-a71b9a34b7ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.850011 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.850044 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxhcc\" (UniqueName: \"kubernetes.io/projected/25c1560a-bd40-490a-8d86-a71b9a34b7ea-kube-api-access-kxhcc\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.850057 4727 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25c1560a-bd40-490a-8d86-a71b9a34b7ea-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.850068 4727 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25c1560a-bd40-490a-8d86-a71b9a34b7ea-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.850076 4727 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.850084 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.852712 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-config-data" (OuterVolumeSpecName: "config-data") pod "25c1560a-bd40-490a-8d86-a71b9a34b7ea" (UID: "25c1560a-bd40-490a-8d86-a71b9a34b7ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:55 crc kubenswrapper[4727]: I1001 12:53:55.951466 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25c1560a-bd40-490a-8d86-a71b9a34b7ea-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.384274 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="122ed690-1a2a-4989-ae98-c9009df8bb95" path="/var/lib/kubelet/pods/122ed690-1a2a-4989-ae98-c9009df8bb95/volumes" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.519915 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25c1560a-bd40-490a-8d86-a71b9a34b7ea","Type":"ContainerDied","Data":"eec082c363ea6843e1a95bccc3760e56fb185dee9511a8ea8ead0d6605585cfe"} Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.519957 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.519986 4727 scope.go:117] "RemoveContainer" containerID="c3d979a0586388baf9f3d1bb9a65762f8594c091bda209d5232182a6888e5004" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.543306 4727 scope.go:117] "RemoveContainer" containerID="8155aac14dfd4155e42e2bf5e444b03a3c02e30d5e3e12e85b5999ab62237a25" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.586596 4727 scope.go:117] "RemoveContainer" containerID="28c31b69b1f606cc17277305288fa4f7b5f954268c25fa4fb1df23484b1f746a" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.591260 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.602133 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.612817 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:53:56 crc kubenswrapper[4727]: E1001 12:53:56.623316 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" containerName="dnsmasq-dns" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623357 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" containerName="dnsmasq-dns" Oct 01 12:53:56 crc kubenswrapper[4727]: E1001 12:53:56.623382 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerName="ceilometer-central-agent" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623388 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerName="ceilometer-central-agent" Oct 01 12:53:56 crc kubenswrapper[4727]: E1001 12:53:56.623398 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" containerName="init" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623404 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" containerName="init" Oct 01 12:53:56 crc kubenswrapper[4727]: E1001 12:53:56.623415 4727 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerName="ceilometer-notification-agent" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623421 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerName="ceilometer-notification-agent" Oct 01 12:53:56 crc kubenswrapper[4727]: E1001 12:53:56.623444 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122ed690-1a2a-4989-ae98-c9009df8bb95" containerName="barbican-api" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623450 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="122ed690-1a2a-4989-ae98-c9009df8bb95" containerName="barbican-api" Oct 01 12:53:56 crc kubenswrapper[4727]: E1001 12:53:56.623470 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122ed690-1a2a-4989-ae98-c9009df8bb95" containerName="barbican-api-log" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623476 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="122ed690-1a2a-4989-ae98-c9009df8bb95" containerName="barbican-api-log" Oct 01 12:53:56 crc kubenswrapper[4727]: E1001 12:53:56.623495 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerName="proxy-httpd" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623501 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerName="proxy-httpd" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623721 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ac2a2d-9be3-4f1e-b2dc-9507cb35efdd" containerName="dnsmasq-dns" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623740 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="122ed690-1a2a-4989-ae98-c9009df8bb95" containerName="barbican-api" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623751 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerName="proxy-httpd" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623763 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerName="ceilometer-notification-agent" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623772 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" containerName="ceilometer-central-agent" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.623784 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="122ed690-1a2a-4989-ae98-c9009df8bb95" containerName="barbican-api-log" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.625372 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.636025 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.636371 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.637827 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.775319 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.775390 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qlq\" (UniqueName: \"kubernetes.io/projected/810e3b5f-3968-469f-8a31-a5426587d78a-kube-api-access-m5qlq\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.775421 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.775501 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-config-data\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.775528 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-scripts\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.775651 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/810e3b5f-3968-469f-8a31-a5426587d78a-log-httpd\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.775682 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/810e3b5f-3968-469f-8a31-a5426587d78a-run-httpd\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.877917 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 
12:53:56.878588 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qlq\" (UniqueName: \"kubernetes.io/projected/810e3b5f-3968-469f-8a31-a5426587d78a-kube-api-access-m5qlq\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.878620 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.878654 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-config-data\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.878685 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-scripts\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.878743 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/810e3b5f-3968-469f-8a31-a5426587d78a-log-httpd\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.878765 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/810e3b5f-3968-469f-8a31-a5426587d78a-run-httpd\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.879965 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/810e3b5f-3968-469f-8a31-a5426587d78a-run-httpd\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.880635 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/810e3b5f-3968-469f-8a31-a5426587d78a-log-httpd\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.888677 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-config-data\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.892892 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-scripts\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.895071 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.896354 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.902797 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qlq\" (UniqueName: \"kubernetes.io/projected/810e3b5f-3968-469f-8a31-a5426587d78a-kube-api-access-m5qlq\") pod \"ceilometer-0\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.952125 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:53:56 crc kubenswrapper[4727]: I1001 12:53:56.958024 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.082109 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4da191c-6509-4bb7-b9b2-344f8224ae58-config\") pod \"d4da191c-6509-4bb7-b9b2-344f8224ae58\" (UID: \"d4da191c-6509-4bb7-b9b2-344f8224ae58\") " Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.082636 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltttk\" (UniqueName: \"kubernetes.io/projected/d4da191c-6509-4bb7-b9b2-344f8224ae58-kube-api-access-ltttk\") pod \"d4da191c-6509-4bb7-b9b2-344f8224ae58\" (UID: \"d4da191c-6509-4bb7-b9b2-344f8224ae58\") " Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.082729 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4da191c-6509-4bb7-b9b2-344f8224ae58-combined-ca-bundle\") pod \"d4da191c-6509-4bb7-b9b2-344f8224ae58\" (UID: \"d4da191c-6509-4bb7-b9b2-344f8224ae58\") " Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.088833 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4da191c-6509-4bb7-b9b2-344f8224ae58-kube-api-access-ltttk" (OuterVolumeSpecName: "kube-api-access-ltttk") pod "d4da191c-6509-4bb7-b9b2-344f8224ae58" (UID: "d4da191c-6509-4bb7-b9b2-344f8224ae58"). InnerVolumeSpecName "kube-api-access-ltttk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.142165 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4da191c-6509-4bb7-b9b2-344f8224ae58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4da191c-6509-4bb7-b9b2-344f8224ae58" (UID: "d4da191c-6509-4bb7-b9b2-344f8224ae58"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.181323 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4da191c-6509-4bb7-b9b2-344f8224ae58-config" (OuterVolumeSpecName: "config") pod "d4da191c-6509-4bb7-b9b2-344f8224ae58" (UID: "d4da191c-6509-4bb7-b9b2-344f8224ae58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.189237 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltttk\" (UniqueName: \"kubernetes.io/projected/d4da191c-6509-4bb7-b9b2-344f8224ae58-kube-api-access-ltttk\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.189318 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4da191c-6509-4bb7-b9b2-344f8224ae58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.189331 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4da191c-6509-4bb7-b9b2-344f8224ae58-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.194839 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 01 12:53:57 crc kubenswrapper[4727]: E1001 12:53:57.195718 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4da191c-6509-4bb7-b9b2-344f8224ae58" containerName="neutron-db-sync" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.195738 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4da191c-6509-4bb7-b9b2-344f8224ae58" containerName="neutron-db-sync" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.195972 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4da191c-6509-4bb7-b9b2-344f8224ae58" containerName="neutron-db-sync" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.196656 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.199875 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4nxs8" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.200187 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.200394 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.203791 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.393220 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab508b8-e8d9-40ce-999a-366d55147a8d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.393418 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fab508b8-e8d9-40ce-999a-366d55147a8d-openstack-config\") pod \"openstackclient\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.393469 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4v68\" (UniqueName: \"kubernetes.io/projected/fab508b8-e8d9-40ce-999a-366d55147a8d-kube-api-access-z4v68\") pod \"openstackclient\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.393499 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fab508b8-e8d9-40ce-999a-366d55147a8d-openstack-config-secret\") pod \"openstackclient\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.426493 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 01 12:53:57 crc kubenswrapper[4727]: E1001 12:53:57.432250 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-z4v68 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="fab508b8-e8d9-40ce-999a-366d55147a8d" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.466184 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.499398 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fab508b8-e8d9-40ce-999a-366d55147a8d-openstack-config\") pod \"openstackclient\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.499465 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4v68\" (UniqueName: 
\"kubernetes.io/projected/fab508b8-e8d9-40ce-999a-366d55147a8d-kube-api-access-z4v68\") pod \"openstackclient\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.499487 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fab508b8-e8d9-40ce-999a-366d55147a8d-openstack-config-secret\") pod \"openstackclient\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.499544 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab508b8-e8d9-40ce-999a-366d55147a8d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.500912 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fab508b8-e8d9-40ce-999a-366d55147a8d-openstack-config\") pod \"openstackclient\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.503334 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.505963 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.506640 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab508b8-e8d9-40ce-999a-366d55147a8d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: E1001 12:53:57.506227 4727 projected.go:194] Error preparing data for projected volume kube-api-access-z4v68 for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (fab508b8-e8d9-40ce-999a-366d55147a8d) does not match the UID in record. The object might have been deleted and then recreated Oct 01 12:53:57 crc kubenswrapper[4727]: E1001 12:53:57.506745 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fab508b8-e8d9-40ce-999a-366d55147a8d-kube-api-access-z4v68 podName:fab508b8-e8d9-40ce-999a-366d55147a8d nodeName:}" failed. No retries permitted until 2025-10-01 12:53:58.006724171 +0000 UTC m=+1016.328079008 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-z4v68" (UniqueName: "kubernetes.io/projected/fab508b8-e8d9-40ce-999a-366d55147a8d-kube-api-access-z4v68") pod "openstackclient" (UID: "fab508b8-e8d9-40ce-999a-366d55147a8d") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (fab508b8-e8d9-40ce-999a-366d55147a8d) does not match the UID in record. 
The object might have been deleted and then recreated Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.522226 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fab508b8-e8d9-40ce-999a-366d55147a8d-openstack-config-secret\") pod \"openstackclient\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.534168 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.540355 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.548083 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nkrx8" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.549576 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nkrx8" event={"ID":"d4da191c-6509-4bb7-b9b2-344f8224ae58","Type":"ContainerDied","Data":"626614f1e1533b93ced26cb42f32d8b94e6fec134921fca558ff02056e628089"} Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.549639 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="626614f1e1533b93ced26cb42f32d8b94e6fec134921fca558ff02056e628089" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.559582 4727 generic.go:334] "Generic (PLEG): container finished" podID="5746629a-ce5e-4404-8996-165034633b9e" containerID="031f6669cf8749f3fcf4677257b33ab960904b0aea2446f27aae497cffcf9f2b" exitCode=0 Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.559747 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.561306 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-74wc5" event={"ID":"5746629a-ce5e-4404-8996-165034633b9e","Type":"ContainerDied","Data":"031f6669cf8749f3fcf4677257b33ab960904b0aea2446f27aae497cffcf9f2b"} Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.576053 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.589021 4727 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fab508b8-e8d9-40ce-999a-366d55147a8d" podUID="fc493472-2f4d-4d92-9ba3-22850bd45ae6" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.601423 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fab508b8-e8d9-40ce-999a-366d55147a8d-openstack-config\") pod \"fab508b8-e8d9-40ce-999a-366d55147a8d\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.601543 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fab508b8-e8d9-40ce-999a-366d55147a8d-openstack-config-secret\") pod \"fab508b8-e8d9-40ce-999a-366d55147a8d\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.601606 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab508b8-e8d9-40ce-999a-366d55147a8d-combined-ca-bundle\") pod \"fab508b8-e8d9-40ce-999a-366d55147a8d\" (UID: \"fab508b8-e8d9-40ce-999a-366d55147a8d\") " Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.602117 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fc493472-2f4d-4d92-9ba3-22850bd45ae6-openstack-config\") pod \"openstackclient\" (UID: \"fc493472-2f4d-4d92-9ba3-22850bd45ae6\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.602236 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wldq\" (UniqueName: \"kubernetes.io/projected/fc493472-2f4d-4d92-9ba3-22850bd45ae6-kube-api-access-2wldq\") pod \"openstackclient\" (UID: \"fc493472-2f4d-4d92-9ba3-22850bd45ae6\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.602387 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc493472-2f4d-4d92-9ba3-22850bd45ae6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fc493472-2f4d-4d92-9ba3-22850bd45ae6\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.602452 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fc493472-2f4d-4d92-9ba3-22850bd45ae6-openstack-config-secret\") pod \"openstackclient\" (UID: \"fc493472-2f4d-4d92-9ba3-22850bd45ae6\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.602533 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4v68\" (UniqueName: \"kubernetes.io/projected/fab508b8-e8d9-40ce-999a-366d55147a8d-kube-api-access-z4v68\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.602793 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fab508b8-e8d9-40ce-999a-366d55147a8d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod 
"fab508b8-e8d9-40ce-999a-366d55147a8d" (UID: "fab508b8-e8d9-40ce-999a-366d55147a8d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.607066 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab508b8-e8d9-40ce-999a-366d55147a8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fab508b8-e8d9-40ce-999a-366d55147a8d" (UID: "fab508b8-e8d9-40ce-999a-366d55147a8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.607271 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab508b8-e8d9-40ce-999a-366d55147a8d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fab508b8-e8d9-40ce-999a-366d55147a8d" (UID: "fab508b8-e8d9-40ce-999a-366d55147a8d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.704563 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fc493472-2f4d-4d92-9ba3-22850bd45ae6-openstack-config\") pod \"openstackclient\" (UID: \"fc493472-2f4d-4d92-9ba3-22850bd45ae6\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.704665 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wldq\" (UniqueName: \"kubernetes.io/projected/fc493472-2f4d-4d92-9ba3-22850bd45ae6-kube-api-access-2wldq\") pod \"openstackclient\" (UID: \"fc493472-2f4d-4d92-9ba3-22850bd45ae6\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.704743 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc493472-2f4d-4d92-9ba3-22850bd45ae6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fc493472-2f4d-4d92-9ba3-22850bd45ae6\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.704777 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fc493472-2f4d-4d92-9ba3-22850bd45ae6-openstack-config-secret\") pod \"openstackclient\" (UID: \"fc493472-2f4d-4d92-9ba3-22850bd45ae6\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.704859 4727 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fab508b8-e8d9-40ce-999a-366d55147a8d-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.704872 4727 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fab508b8-e8d9-40ce-999a-366d55147a8d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.704884 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab508b8-e8d9-40ce-999a-366d55147a8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.705782 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fc493472-2f4d-4d92-9ba3-22850bd45ae6-openstack-config\") pod \"openstackclient\" (UID: \"fc493472-2f4d-4d92-9ba3-22850bd45ae6\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.710217 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fc493472-2f4d-4d92-9ba3-22850bd45ae6-openstack-config-secret\") pod \"openstackclient\" (UID: \"fc493472-2f4d-4d92-9ba3-22850bd45ae6\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.710860 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc493472-2f4d-4d92-9ba3-22850bd45ae6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fc493472-2f4d-4d92-9ba3-22850bd45ae6\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.752204 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-jpssp"] Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.753845 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wldq\" (UniqueName: \"kubernetes.io/projected/fc493472-2f4d-4d92-9ba3-22850bd45ae6-kube-api-access-2wldq\") pod \"openstackclient\" (UID: \"fc493472-2f4d-4d92-9ba3-22850bd45ae6\") " pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.760667 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.778324 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-jpssp"] Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.806186 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.806248 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-config\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.806289 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv2nf\" (UniqueName: \"kubernetes.io/projected/f720599a-1317-44ee-a6c4-72581187a9ad-kube-api-access-xv2nf\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.806452 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.806573 4727 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.806809 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.826770 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.908270 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.908362 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.908409 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-config\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.908444 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv2nf\" (UniqueName: \"kubernetes.io/projected/f720599a-1317-44ee-a6c4-72581187a9ad-kube-api-access-xv2nf\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.908543 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.908958 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.910764 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-ovsdbserver-nb\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: 
\"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.910961 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-dns-svc\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.911636 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-ovsdbserver-sb\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.911850 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-dns-swift-storage-0\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.915697 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-config\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.929599 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv2nf\" (UniqueName: \"kubernetes.io/projected/f720599a-1317-44ee-a6c4-72581187a9ad-kube-api-access-xv2nf\") pod \"dnsmasq-dns-75c8ddd69c-jpssp\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.974026 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5688b44d4b-ns86z"] Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.975450 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.982075 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.983489 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.983707 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.983825 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zjn49" Oct 01 12:53:57 crc kubenswrapper[4727]: I1001 12:53:57.992545 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5688b44d4b-ns86z"] Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.009536 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-httpd-config\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.009616 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-ovndb-tls-certs\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.009637 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwxzr\" (UniqueName: \"kubernetes.io/projected/53c63404-aa0e-4a37-9aaf-f75e8c50831a-kube-api-access-vwxzr\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.009662 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-combined-ca-bundle\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.009697 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-config\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.091740 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.111617 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwxzr\" (UniqueName: \"kubernetes.io/projected/53c63404-aa0e-4a37-9aaf-f75e8c50831a-kube-api-access-vwxzr\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.111678 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-combined-ca-bundle\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.111733 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-config\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.111864 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-httpd-config\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.111938 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-ovndb-tls-certs\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.125701 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-combined-ca-bundle\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.125932 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-httpd-config\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.128838 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-config\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.130217 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-ovndb-tls-certs\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.155963 4727 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vwxzr\" (UniqueName: \"kubernetes.io/projected/53c63404-aa0e-4a37-9aaf-f75e8c50831a-kube-api-access-vwxzr\") pod \"neutron-5688b44d4b-ns86z\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.357438 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.400606 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25c1560a-bd40-490a-8d86-a71b9a34b7ea" path="/var/lib/kubelet/pods/25c1560a-bd40-490a-8d86-a71b9a34b7ea/volumes" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.421243 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fab508b8-e8d9-40ce-999a-366d55147a8d" path="/var/lib/kubelet/pods/fab508b8-e8d9-40ce-999a-366d55147a8d/volumes" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.507176 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.646773 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"810e3b5f-3968-469f-8a31-a5426587d78a","Type":"ContainerStarted","Data":"65fe6af4e59ca8c10befa9704b6a8325168876b53e115909dda8ed3711b9ca53"} Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.656787 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fc493472-2f4d-4d92-9ba3-22850bd45ae6","Type":"ContainerStarted","Data":"a9b2b4698f62c1c5a7bea3d6a47d3b7bdfe9d1257e18d7537771f960e1542eaa"} Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.656927 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.677405 4727 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fab508b8-e8d9-40ce-999a-366d55147a8d" podUID="fc493472-2f4d-4d92-9ba3-22850bd45ae6" Oct 01 12:53:58 crc kubenswrapper[4727]: I1001 12:53:58.912255 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-jpssp"] Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.013665 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7df68f6869-rwfcm"] Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.044221 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.045397 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7df68f6869-rwfcm"] Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.049177 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.049487 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.049634 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.147556 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-config-data\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.148062 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksr7q\" (UniqueName: \"kubernetes.io/projected/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-kube-api-access-ksr7q\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.148151 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-combined-ca-bundle\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.148211 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-internal-tls-certs\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.148308 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-log-httpd\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.148448 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-etc-swift\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.148508 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-run-httpd\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " 
pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.148543 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-public-tls-certs\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.249901 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-combined-ca-bundle\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.249972 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-internal-tls-certs\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.250089 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-log-httpd\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.250164 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-etc-swift\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.250193 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-run-httpd\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.250225 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-public-tls-certs\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.250253 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-config-data\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.250289 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksr7q\" (UniqueName: \"kubernetes.io/projected/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-kube-api-access-ksr7q\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" 
Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.255314 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-log-httpd\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.260122 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-internal-tls-certs\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.260688 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-combined-ca-bundle\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.260817 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-run-httpd\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.266433 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-config-data\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.268081 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-etc-swift\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.271103 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-public-tls-certs\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.279965 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.286790 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5688b44d4b-ns86z"] Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.299485 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksr7q\" (UniqueName: \"kubernetes.io/projected/7c5e6c5d-4f10-437f-b20e-f3394093b3b9-kube-api-access-ksr7q\") pod \"swift-proxy-7df68f6869-rwfcm\" (UID: \"7c5e6c5d-4f10-437f-b20e-f3394093b3b9\") " pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: W1001 12:53:59.324097 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53c63404_aa0e_4a37_9aaf_f75e8c50831a.slice/crio-c4eb480b7d3e0d2bc049bb91a3f646d1c71fa7dfaa97487e3aa3ad256206e652 WatchSource:0}: Error finding container c4eb480b7d3e0d2bc049bb91a3f646d1c71fa7dfaa97487e3aa3ad256206e652: Status 404 returned error can't find the container with id c4eb480b7d3e0d2bc049bb91a3f646d1c71fa7dfaa97487e3aa3ad256206e652 Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.400509 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.453952 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-db-sync-config-data\") pod \"5746629a-ce5e-4404-8996-165034633b9e\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.454052 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-scripts\") pod \"5746629a-ce5e-4404-8996-165034633b9e\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.454186 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-config-data\") pod \"5746629a-ce5e-4404-8996-165034633b9e\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.454211 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-combined-ca-bundle\") pod \"5746629a-ce5e-4404-8996-165034633b9e\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.454263 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l22hk\" (UniqueName: \"kubernetes.io/projected/5746629a-ce5e-4404-8996-165034633b9e-kube-api-access-l22hk\") pod \"5746629a-ce5e-4404-8996-165034633b9e\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.454303 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5746629a-ce5e-4404-8996-165034633b9e-etc-machine-id\") pod \"5746629a-ce5e-4404-8996-165034633b9e\" (UID: \"5746629a-ce5e-4404-8996-165034633b9e\") " Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.454991 4727 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5746629a-ce5e-4404-8996-165034633b9e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5746629a-ce5e-4404-8996-165034633b9e" (UID: "5746629a-ce5e-4404-8996-165034633b9e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.465655 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5746629a-ce5e-4404-8996-165034633b9e-kube-api-access-l22hk" (OuterVolumeSpecName: "kube-api-access-l22hk") pod "5746629a-ce5e-4404-8996-165034633b9e" (UID: "5746629a-ce5e-4404-8996-165034633b9e"). InnerVolumeSpecName "kube-api-access-l22hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.466097 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5746629a-ce5e-4404-8996-165034633b9e" (UID: "5746629a-ce5e-4404-8996-165034633b9e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.468009 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-scripts" (OuterVolumeSpecName: "scripts") pod "5746629a-ce5e-4404-8996-165034633b9e" (UID: "5746629a-ce5e-4404-8996-165034633b9e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.538357 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5746629a-ce5e-4404-8996-165034633b9e" (UID: "5746629a-ce5e-4404-8996-165034633b9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.556464 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.556494 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.556504 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l22hk\" (UniqueName: \"kubernetes.io/projected/5746629a-ce5e-4404-8996-165034633b9e-kube-api-access-l22hk\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.556513 4727 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5746629a-ce5e-4404-8996-165034633b9e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.556521 4727 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.582842 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-config-data" (OuterVolumeSpecName: "config-data") pod "5746629a-ce5e-4404-8996-165034633b9e" (UID: "5746629a-ce5e-4404-8996-165034633b9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.660042 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5746629a-ce5e-4404-8996-165034633b9e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.694867 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"810e3b5f-3968-469f-8a31-a5426587d78a","Type":"ContainerStarted","Data":"0ab3c8007acb4882f7977db1196ab7951d07b6eab131768f2137ebbdf531d39e"} Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.694941 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"810e3b5f-3968-469f-8a31-a5426587d78a","Type":"ContainerStarted","Data":"f42b4535112d4770d0f0555d861bf11da4e6eb8063f6a8c64b028dfbbe8af7e0"} Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.701839 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-74wc5" event={"ID":"5746629a-ce5e-4404-8996-165034633b9e","Type":"ContainerDied","Data":"c02577c694ac2f9289bab056b5cb5264f3bd8ac0759174045ceea294b58272cd"} Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.701911 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c02577c694ac2f9289bab056b5cb5264f3bd8ac0759174045ceea294b58272cd" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.702105 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-74wc5" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.769457 4727 generic.go:334] "Generic (PLEG): container finished" podID="f720599a-1317-44ee-a6c4-72581187a9ad" containerID="686be2abdb0b75c4abccfb7784631daf263455511d4b94bfbd5dec441b98c181" exitCode=0 Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.770669 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" event={"ID":"f720599a-1317-44ee-a6c4-72581187a9ad","Type":"ContainerDied","Data":"686be2abdb0b75c4abccfb7784631daf263455511d4b94bfbd5dec441b98c181"} Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.770702 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" event={"ID":"f720599a-1317-44ee-a6c4-72581187a9ad","Type":"ContainerStarted","Data":"1a007ada57b17ebac7cb4b681ae2c5843e7e49c4fe5046e03fb9024e34f641c5"} Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.795363 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5688b44d4b-ns86z" event={"ID":"53c63404-aa0e-4a37-9aaf-f75e8c50831a","Type":"ContainerStarted","Data":"57d06a2d1470dd4d537ba205ae76ac84eb026f3337bb1a1074280a6153dadffb"} Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.795424 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5688b44d4b-ns86z" event={"ID":"53c63404-aa0e-4a37-9aaf-f75e8c50831a","Type":"ContainerStarted","Data":"c4eb480b7d3e0d2bc049bb91a3f646d1c71fa7dfaa97487e3aa3ad256206e652"} Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.893564 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:53:59 crc kubenswrapper[4727]: E1001 12:53:59.894181 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5746629a-ce5e-4404-8996-165034633b9e" containerName="cinder-db-sync" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.894208 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="5746629a-ce5e-4404-8996-165034633b9e" containerName="cinder-db-sync" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.894441 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="5746629a-ce5e-4404-8996-165034633b9e" containerName="cinder-db-sync" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.895777 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.909042 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xkl2d" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.909169 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.913602 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.915059 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.919064 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.925735 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-jpssp"] Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.981458 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-scripts\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.981549 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.981640 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hrlk\" (UniqueName: \"kubernetes.io/projected/edc05528-2111-4383-b904-8ad44aaa0a11-kube-api-access-7hrlk\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.981681 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/edc05528-2111-4383-b904-8ad44aaa0a11-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.981705 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-config-data\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.981758 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:53:59 crc kubenswrapper[4727]: I1001 12:53:59.988485 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-thhck"] Oct 01 12:53:59 crc 
kubenswrapper[4727]: I1001 12:53:59.994960 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.028022 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.037469 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.045812 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.058004 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-thhck"] Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.085834 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hrlk\" (UniqueName: \"kubernetes.io/projected/edc05528-2111-4383-b904-8ad44aaa0a11-kube-api-access-7hrlk\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.085897 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.085918 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/edc05528-2111-4383-b904-8ad44aaa0a11-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.085935 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-config-data\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.085963 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-dns-svc\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.085993 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.086064 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-config\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 
12:54:00.086092 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbgwh\" (UniqueName: \"kubernetes.io/projected/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-kube-api-access-qbgwh\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.086115 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-scripts\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.086153 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.086189 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.086261 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.086635 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/edc05528-2111-4383-b904-8ad44aaa0a11-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.097160 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.115417 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-config-data\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.118126 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-scripts\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.131347 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.136886 
4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.138975 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hrlk\" (UniqueName: \"kubernetes.io/projected/edc05528-2111-4383-b904-8ad44aaa0a11-kube-api-access-7hrlk\") pod \"cinder-scheduler-0\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.188725 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-scripts\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.188819 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-config\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.189482 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab0a0781-bd35-4809-a999-fb592655bcc5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.189599 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbgwh\" (UniqueName: \"kubernetes.io/projected/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-kube-api-access-qbgwh\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.189754 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.189907 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm9tf\" (UniqueName: \"kubernetes.io/projected/ab0a0781-bd35-4809-a999-fb592655bcc5-kube-api-access-dm9tf\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.192043 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.195866 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ab0a0781-bd35-4809-a999-fb592655bcc5-logs\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.197065 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.197205 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-config-data\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.197335 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.197493 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-dns-svc\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.197596 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-config-data-custom\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.193299 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-config\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.195810 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.203300 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.205982 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " 
pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.206212 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-dns-svc\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.244364 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbgwh\" (UniqueName: \"kubernetes.io/projected/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-kube-api-access-qbgwh\") pod \"dnsmasq-dns-5784cf869f-thhck\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.301237 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-config-data-custom\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.301312 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-scripts\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.301372 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab0a0781-bd35-4809-a999-fb592655bcc5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.301448 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm9tf\" (UniqueName: \"kubernetes.io/projected/ab0a0781-bd35-4809-a999-fb592655bcc5-kube-api-access-dm9tf\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.301521 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.301547 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0a0781-bd35-4809-a999-fb592655bcc5-logs\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.301599 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-config-data\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.307518 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab0a0781-bd35-4809-a999-fb592655bcc5-etc-machine-id\") pod \"cinder-api-0\" 
(UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.311027 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-config-data-custom\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.313240 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0a0781-bd35-4809-a999-fb592655bcc5-logs\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.314548 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.316754 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-scripts\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.322859 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.328049 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-config-data\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.333773 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm9tf\" (UniqueName: \"kubernetes.io/projected/ab0a0781-bd35-4809-a999-fb592655bcc5-kube-api-access-dm9tf\") pod \"cinder-api-0\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.377410 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.425079 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7df68f6869-rwfcm"] Oct 01 12:54:00 crc kubenswrapper[4727]: E1001 12:54:00.493181 4727 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 01 12:54:00 crc kubenswrapper[4727]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/f720599a-1317-44ee-a6c4-72581187a9ad/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 01 12:54:00 crc kubenswrapper[4727]: > podSandboxID="1a007ada57b17ebac7cb4b681ae2c5843e7e49c4fe5046e03fb9024e34f641c5" Oct 01 12:54:00 crc kubenswrapper[4727]: E1001 12:54:00.493341 4727 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 01 12:54:00 crc kubenswrapper[4727]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n566h86hd4h5f9hc8h599h5h56bh75h554h597h5f4hb7h98h58fh66ch57ch668h5bfhd8h596h68dh54h8ch674h587h5bdhb9hc4h695h5b8hccq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xv2nf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-75c8ddd69c-jpssp_openstack(f720599a-1317-44ee-a6c4-72581187a9ad): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/f720599a-1317-44ee-a6c4-72581187a9ad/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 01 12:54:00 crc kubenswrapper[4727]: > logger="UnhandledError" Oct 01 12:54:00 crc kubenswrapper[4727]: E1001 12:54:00.494763 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/f720599a-1317-44ee-a6c4-72581187a9ad/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" podUID="f720599a-1317-44ee-a6c4-72581187a9ad" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.526717 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.539417 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.831590 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"810e3b5f-3968-469f-8a31-a5426587d78a","Type":"ContainerStarted","Data":"345764f7582ecbaa73cc3dbdd8ce2b3a94cd865bc970655933456adb8026a488"} Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.835215 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7df68f6869-rwfcm" event={"ID":"7c5e6c5d-4f10-437f-b20e-f3394093b3b9","Type":"ContainerStarted","Data":"6db23465f6d8e70d53c7d2d517fe351970f0928909d9b0b14f9fc8cd149b07b3"} Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.854851 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5688b44d4b-ns86z" event={"ID":"53c63404-aa0e-4a37-9aaf-f75e8c50831a","Type":"ContainerStarted","Data":"6e35902325bbfbc610695e4b063a45caa0e78489199df5def2be80f434d7d0ca"} Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.854916 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:54:00 crc kubenswrapper[4727]: I1001 12:54:00.887274 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5688b44d4b-ns86z" podStartSLOduration=3.887253222 podStartE2EDuration="3.887253222s" podCreationTimestamp="2025-10-01 12:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:00.8753437 +0000 UTC m=+1019.196698557" 
watchObservedRunningTime="2025-10-01 12:54:00.887253222 +0000 UTC m=+1019.208608059" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.174629 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.405531 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-thhck"] Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.512362 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.512981 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.657299 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-dns-svc\") pod \"f720599a-1317-44ee-a6c4-72581187a9ad\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.657629 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-ovsdbserver-nb\") pod \"f720599a-1317-44ee-a6c4-72581187a9ad\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.657831 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv2nf\" (UniqueName: \"kubernetes.io/projected/f720599a-1317-44ee-a6c4-72581187a9ad-kube-api-access-xv2nf\") pod \"f720599a-1317-44ee-a6c4-72581187a9ad\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.657969 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-dns-swift-storage-0\") pod \"f720599a-1317-44ee-a6c4-72581187a9ad\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.658095 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-ovsdbserver-sb\") pod \"f720599a-1317-44ee-a6c4-72581187a9ad\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.658476 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-config\") pod \"f720599a-1317-44ee-a6c4-72581187a9ad\" (UID: \"f720599a-1317-44ee-a6c4-72581187a9ad\") " Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.672684 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f720599a-1317-44ee-a6c4-72581187a9ad-kube-api-access-xv2nf" (OuterVolumeSpecName: "kube-api-access-xv2nf") pod "f720599a-1317-44ee-a6c4-72581187a9ad" (UID: "f720599a-1317-44ee-a6c4-72581187a9ad"). InnerVolumeSpecName "kube-api-access-xv2nf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.761991 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv2nf\" (UniqueName: \"kubernetes.io/projected/f720599a-1317-44ee-a6c4-72581187a9ad-kube-api-access-xv2nf\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.782809 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f720599a-1317-44ee-a6c4-72581187a9ad" (UID: "f720599a-1317-44ee-a6c4-72581187a9ad"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.864475 4727 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.865669 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f720599a-1317-44ee-a6c4-72581187a9ad" (UID: "f720599a-1317-44ee-a6c4-72581187a9ad"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.882666 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f720599a-1317-44ee-a6c4-72581187a9ad" (UID: "f720599a-1317-44ee-a6c4-72581187a9ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.893309 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-config" (OuterVolumeSpecName: "config") pod "f720599a-1317-44ee-a6c4-72581187a9ad" (UID: "f720599a-1317-44ee-a6c4-72581187a9ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.897860 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f720599a-1317-44ee-a6c4-72581187a9ad" (UID: "f720599a-1317-44ee-a6c4-72581187a9ad"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.970350 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.970948 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.971102 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.971116 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f720599a-1317-44ee-a6c4-72581187a9ad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.971451 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7df68f6869-rwfcm" event={"ID":"7c5e6c5d-4f10-437f-b20e-f3394093b3b9","Type":"ContainerStarted","Data":"ecb6623b18ae5198f8d8f05249c08f6b3a8a1225d6c6dab4678c3145c72fa258"} Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.971484 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7df68f6869-rwfcm" event={"ID":"7c5e6c5d-4f10-437f-b20e-f3394093b3b9","Type":"ContainerStarted","Data":"42384511e7949d98a18a51fe4169131c73a9bbccf966eaade3ad5f3b3d6e9ded"} Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.971518 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.971536 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.974973 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ab0a0781-bd35-4809-a999-fb592655bcc5","Type":"ContainerStarted","Data":"5df089435d1aeb50afc84a8c030a4e2349ad3eb08e8c3f737cfc9e7c641ac1e7"} Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.981681 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"edc05528-2111-4383-b904-8ad44aaa0a11","Type":"ContainerStarted","Data":"fe7edf6f8484d18cf390652800d7cd5a44c24eb4300debb39bc9969082b4b24c"} Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.985241 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" event={"ID":"f720599a-1317-44ee-a6c4-72581187a9ad","Type":"ContainerDied","Data":"1a007ada57b17ebac7cb4b681ae2c5843e7e49c4fe5046e03fb9024e34f641c5"} Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.985408 4727 scope.go:117] "RemoveContainer" containerID="686be2abdb0b75c4abccfb7784631daf263455511d4b94bfbd5dec441b98c181" Oct 01 12:54:01 crc kubenswrapper[4727]: I1001 12:54:01.985583 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c8ddd69c-jpssp" Oct 01 12:54:02 crc kubenswrapper[4727]: I1001 12:54:02.000996 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7df68f6869-rwfcm" podStartSLOduration=4.000972675 podStartE2EDuration="4.000972675s" podCreationTimestamp="2025-10-01 12:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:01.999413196 +0000 UTC m=+1020.320768053" watchObservedRunningTime="2025-10-01 12:54:02.000972675 +0000 UTC m=+1020.322327522" Oct 01 12:54:02 crc kubenswrapper[4727]: I1001 12:54:02.010768 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-thhck" event={"ID":"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0","Type":"ContainerStarted","Data":"e3215d1dd88a5196137e727b9e2db015b4325c899f36b4175fe83b5a6b244d14"} Oct 01 12:54:02 crc kubenswrapper[4727]: I1001 12:54:02.326763 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-jpssp"] Oct 01 12:54:02 crc kubenswrapper[4727]: I1001 12:54:02.337275 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c8ddd69c-jpssp"] Oct 01 12:54:02 crc kubenswrapper[4727]: I1001 12:54:02.400505 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f720599a-1317-44ee-a6c4-72581187a9ad" path="/var/lib/kubelet/pods/f720599a-1317-44ee-a6c4-72581187a9ad/volumes" Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.042560 4727 generic.go:334] "Generic (PLEG): container finished" podID="dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" containerID="510d5edffe1462f2a9bc5254c8aa0f392fde857c01445c509f6c935e4839da71" exitCode=0 Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.044643 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-thhck" event={"ID":"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0","Type":"ContainerDied","Data":"510d5edffe1462f2a9bc5254c8aa0f392fde857c01445c509f6c935e4839da71"} Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.044699 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.044715 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-thhck" event={"ID":"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0","Type":"ContainerStarted","Data":"759006ab7302535a7223715dd8e5b23bbc8d0cd4ded5a1c343658428088b08a8"} Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.061770 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"810e3b5f-3968-469f-8a31-a5426587d78a","Type":"ContainerStarted","Data":"ce87f31ed80d5006851eee2237f3b16ab754bbff8c4d489a051b218708b6c875"} Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.061969 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="ceilometer-central-agent" containerID="cri-o://0ab3c8007acb4882f7977db1196ab7951d07b6eab131768f2137ebbdf531d39e" gracePeriod=30 Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.062270 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.062336 4727 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="proxy-httpd" containerID="cri-o://ce87f31ed80d5006851eee2237f3b16ab754bbff8c4d489a051b218708b6c875" gracePeriod=30 Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.062390 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="sg-core" containerID="cri-o://345764f7582ecbaa73cc3dbdd8ce2b3a94cd865bc970655933456adb8026a488" gracePeriod=30 Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.062464 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="ceilometer-notification-agent" containerID="cri-o://f42b4535112d4770d0f0555d861bf11da4e6eb8063f6a8c64b028dfbbe8af7e0" gracePeriod=30 Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.092934 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ab0a0781-bd35-4809-a999-fb592655bcc5","Type":"ContainerStarted","Data":"3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4"} Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.114128 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-thhck" podStartSLOduration=4.114104711 podStartE2EDuration="4.114104711s" podCreationTimestamp="2025-10-01 12:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:03.10093666 +0000 UTC m=+1021.422291497" watchObservedRunningTime="2025-10-01 12:54:03.114104711 +0000 UTC m=+1021.435459548" Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.151610 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.646297733 podStartE2EDuration="7.151592762s" podCreationTimestamp="2025-10-01 12:53:56 +0000 UTC" firstStartedPulling="2025-10-01 12:53:57.546814464 +0000 UTC m=+1015.868169301" lastFinishedPulling="2025-10-01 12:54:02.052109493 +0000 UTC m=+1020.373464330" observedRunningTime="2025-10-01 12:54:03.149698433 +0000 UTC m=+1021.471053280" watchObservedRunningTime="2025-10-01 12:54:03.151592762 +0000 UTC m=+1021.472947609" Oct 01 12:54:03 crc kubenswrapper[4727]: I1001 12:54:03.763531 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.056084 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-75b5d456dc-grn5w"] Oct 01 12:54:04 crc kubenswrapper[4727]: E1001 12:54:04.056627 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f720599a-1317-44ee-a6c4-72581187a9ad" containerName="init" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.056644 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="f720599a-1317-44ee-a6c4-72581187a9ad" containerName="init" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.056864 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="f720599a-1317-44ee-a6c4-72581187a9ad" containerName="init" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.058164 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.062505 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.062858 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.068698 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-public-tls-certs\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.068760 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvlc7\" (UniqueName: \"kubernetes.io/projected/e1c67043-5e23-4c7a-92b8-b7d1513f1392-kube-api-access-qvlc7\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.068787 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-ovndb-tls-certs\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.068851 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-httpd-config\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.068907 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-config\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.068964 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-combined-ca-bundle\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.068981 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-internal-tls-certs\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.073402 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75b5d456dc-grn5w"] Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.139846 4727 generic.go:334] "Generic (PLEG): container finished" podID="810e3b5f-3968-469f-8a31-a5426587d78a" 
containerID="ce87f31ed80d5006851eee2237f3b16ab754bbff8c4d489a051b218708b6c875" exitCode=0 Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.139896 4727 generic.go:334] "Generic (PLEG): container finished" podID="810e3b5f-3968-469f-8a31-a5426587d78a" containerID="345764f7582ecbaa73cc3dbdd8ce2b3a94cd865bc970655933456adb8026a488" exitCode=2 Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.139909 4727 generic.go:334] "Generic (PLEG): container finished" podID="810e3b5f-3968-469f-8a31-a5426587d78a" containerID="f42b4535112d4770d0f0555d861bf11da4e6eb8063f6a8c64b028dfbbe8af7e0" exitCode=0 Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.140027 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"810e3b5f-3968-469f-8a31-a5426587d78a","Type":"ContainerDied","Data":"ce87f31ed80d5006851eee2237f3b16ab754bbff8c4d489a051b218708b6c875"} Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.140066 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"810e3b5f-3968-469f-8a31-a5426587d78a","Type":"ContainerDied","Data":"345764f7582ecbaa73cc3dbdd8ce2b3a94cd865bc970655933456adb8026a488"} Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.140082 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"810e3b5f-3968-469f-8a31-a5426587d78a","Type":"ContainerDied","Data":"f42b4535112d4770d0f0555d861bf11da4e6eb8063f6a8c64b028dfbbe8af7e0"} Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.155848 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ab0a0781-bd35-4809-a999-fb592655bcc5","Type":"ContainerStarted","Data":"299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7"} Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.156978 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.162141 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"edc05528-2111-4383-b904-8ad44aaa0a11","Type":"ContainerStarted","Data":"b524ea1a6d9190ce21bc32ec99b52aa00b9c006ce26c99420f2c6fa628f5d87c"} Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.170384 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-public-tls-certs\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.170473 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvlc7\" (UniqueName: \"kubernetes.io/projected/e1c67043-5e23-4c7a-92b8-b7d1513f1392-kube-api-access-qvlc7\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.170526 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-ovndb-tls-certs\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.170569 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-httpd-config\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.170672 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-config\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.170774 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-combined-ca-bundle\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.170800 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-internal-tls-certs\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.187199 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-config\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.187535 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-public-tls-certs\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.192091 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-internal-tls-certs\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.194118 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-combined-ca-bundle\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.198468 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-ovndb-tls-certs\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.217080 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e1c67043-5e23-4c7a-92b8-b7d1513f1392-httpd-config\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " 
pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.232518 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvlc7\" (UniqueName: \"kubernetes.io/projected/e1c67043-5e23-4c7a-92b8-b7d1513f1392-kube-api-access-qvlc7\") pod \"neutron-75b5d456dc-grn5w\" (UID: \"e1c67043-5e23-4c7a-92b8-b7d1513f1392\") " pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:04 crc kubenswrapper[4727]: I1001 12:54:04.394537 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.207815 4727 generic.go:334] "Generic (PLEG): container finished" podID="810e3b5f-3968-469f-8a31-a5426587d78a" containerID="0ab3c8007acb4882f7977db1196ab7951d07b6eab131768f2137ebbdf531d39e" exitCode=0 Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.208100 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"810e3b5f-3968-469f-8a31-a5426587d78a","Type":"ContainerDied","Data":"0ab3c8007acb4882f7977db1196ab7951d07b6eab131768f2137ebbdf531d39e"} Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.227844 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.227819215 podStartE2EDuration="6.227819215s" podCreationTimestamp="2025-10-01 12:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:04.216261933 +0000 UTC m=+1022.537616780" watchObservedRunningTime="2025-10-01 12:54:05.227819215 +0000 UTC m=+1023.549174092" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.234385 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75b5d456dc-grn5w"] Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.277574 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ab0a0781-bd35-4809-a999-fb592655bcc5" containerName="cinder-api-log" containerID="cri-o://3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4" gracePeriod=30 Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.278205 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ab0a0781-bd35-4809-a999-fb592655bcc5" containerName="cinder-api" containerID="cri-o://299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7" gracePeriod=30 Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.278293 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"edc05528-2111-4383-b904-8ad44aaa0a11","Type":"ContainerStarted","Data":"308149e451fba753c6efce492fb50cbdee2025cd972b0225b81814bca3152252"} Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.324152 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.372795 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.072832193 podStartE2EDuration="6.372770404s" podCreationTimestamp="2025-10-01 12:53:59 +0000 UTC" firstStartedPulling="2025-10-01 12:54:01.280342263 +0000 UTC m=+1019.601697110" lastFinishedPulling="2025-10-01 12:54:02.580280484 +0000 UTC m=+1020.901635321" observedRunningTime="2025-10-01 12:54:05.310324093 +0000 UTC 
m=+1023.631678940" watchObservedRunningTime="2025-10-01 12:54:05.372770404 +0000 UTC m=+1023.694125261" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.680216 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.842299 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-sg-core-conf-yaml\") pod \"810e3b5f-3968-469f-8a31-a5426587d78a\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.843494 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/810e3b5f-3968-469f-8a31-a5426587d78a-log-httpd\") pod \"810e3b5f-3968-469f-8a31-a5426587d78a\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.843726 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-config-data\") pod \"810e3b5f-3968-469f-8a31-a5426587d78a\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.843897 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/810e3b5f-3968-469f-8a31-a5426587d78a-run-httpd\") pod \"810e3b5f-3968-469f-8a31-a5426587d78a\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.844232 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-scripts\") pod \"810e3b5f-3968-469f-8a31-a5426587d78a\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.844483 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-combined-ca-bundle\") pod \"810e3b5f-3968-469f-8a31-a5426587d78a\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.844793 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5qlq\" (UniqueName: \"kubernetes.io/projected/810e3b5f-3968-469f-8a31-a5426587d78a-kube-api-access-m5qlq\") pod \"810e3b5f-3968-469f-8a31-a5426587d78a\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.844285 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/810e3b5f-3968-469f-8a31-a5426587d78a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "810e3b5f-3968-469f-8a31-a5426587d78a" (UID: "810e3b5f-3968-469f-8a31-a5426587d78a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.846984 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/810e3b5f-3968-469f-8a31-a5426587d78a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "810e3b5f-3968-469f-8a31-a5426587d78a" (UID: "810e3b5f-3968-469f-8a31-a5426587d78a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.858583 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/810e3b5f-3968-469f-8a31-a5426587d78a-kube-api-access-m5qlq" (OuterVolumeSpecName: "kube-api-access-m5qlq") pod "810e3b5f-3968-469f-8a31-a5426587d78a" (UID: "810e3b5f-3968-469f-8a31-a5426587d78a"). InnerVolumeSpecName "kube-api-access-m5qlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.861300 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-scripts" (OuterVolumeSpecName: "scripts") pod "810e3b5f-3968-469f-8a31-a5426587d78a" (UID: "810e3b5f-3968-469f-8a31-a5426587d78a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.947231 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "810e3b5f-3968-469f-8a31-a5426587d78a" (UID: "810e3b5f-3968-469f-8a31-a5426587d78a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.947976 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-sg-core-conf-yaml\") pod \"810e3b5f-3968-469f-8a31-a5426587d78a\" (UID: \"810e3b5f-3968-469f-8a31-a5426587d78a\") " Oct 01 12:54:05 crc kubenswrapper[4727]: W1001 12:54:05.948210 4727 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/810e3b5f-3968-469f-8a31-a5426587d78a/volumes/kubernetes.io~secret/sg-core-conf-yaml Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.948241 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "810e3b5f-3968-469f-8a31-a5426587d78a" (UID: "810e3b5f-3968-469f-8a31-a5426587d78a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.948584 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.948604 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5qlq\" (UniqueName: \"kubernetes.io/projected/810e3b5f-3968-469f-8a31-a5426587d78a-kube-api-access-m5qlq\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.948618 4727 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.948628 4727 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/810e3b5f-3968-469f-8a31-a5426587d78a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:05 crc kubenswrapper[4727]: I1001 12:54:05.948641 4727 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/810e3b5f-3968-469f-8a31-a5426587d78a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.029622 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-config-data" (OuterVolumeSpecName: "config-data") pod "810e3b5f-3968-469f-8a31-a5426587d78a" (UID: "810e3b5f-3968-469f-8a31-a5426587d78a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.055943 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.103962 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "810e3b5f-3968-469f-8a31-a5426587d78a" (UID: "810e3b5f-3968-469f-8a31-a5426587d78a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.158335 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810e3b5f-3968-469f-8a31-a5426587d78a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.237462 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.260400 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0a0781-bd35-4809-a999-fb592655bcc5-logs\") pod \"ab0a0781-bd35-4809-a999-fb592655bcc5\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.260589 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm9tf\" (UniqueName: \"kubernetes.io/projected/ab0a0781-bd35-4809-a999-fb592655bcc5-kube-api-access-dm9tf\") pod \"ab0a0781-bd35-4809-a999-fb592655bcc5\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.260630 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-config-data-custom\") pod \"ab0a0781-bd35-4809-a999-fb592655bcc5\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.260671 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-config-data\") pod \"ab0a0781-bd35-4809-a999-fb592655bcc5\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.260740 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-combined-ca-bundle\") pod \"ab0a0781-bd35-4809-a999-fb592655bcc5\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.260760 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab0a0781-bd35-4809-a999-fb592655bcc5-etc-machine-id\") pod \"ab0a0781-bd35-4809-a999-fb592655bcc5\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.260924 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-scripts\") pod \"ab0a0781-bd35-4809-a999-fb592655bcc5\" (UID: \"ab0a0781-bd35-4809-a999-fb592655bcc5\") " Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.262849 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab0a0781-bd35-4809-a999-fb592655bcc5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ab0a0781-bd35-4809-a999-fb592655bcc5" (UID: "ab0a0781-bd35-4809-a999-fb592655bcc5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.263231 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab0a0781-bd35-4809-a999-fb592655bcc5-logs" (OuterVolumeSpecName: "logs") pod "ab0a0781-bd35-4809-a999-fb592655bcc5" (UID: "ab0a0781-bd35-4809-a999-fb592655bcc5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.265351 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ab0a0781-bd35-4809-a999-fb592655bcc5" (UID: "ab0a0781-bd35-4809-a999-fb592655bcc5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.269329 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab0a0781-bd35-4809-a999-fb592655bcc5-kube-api-access-dm9tf" (OuterVolumeSpecName: "kube-api-access-dm9tf") pod "ab0a0781-bd35-4809-a999-fb592655bcc5" (UID: "ab0a0781-bd35-4809-a999-fb592655bcc5"). InnerVolumeSpecName "kube-api-access-dm9tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.272892 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-scripts" (OuterVolumeSpecName: "scripts") pod "ab0a0781-bd35-4809-a999-fb592655bcc5" (UID: "ab0a0781-bd35-4809-a999-fb592655bcc5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.320175 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab0a0781-bd35-4809-a999-fb592655bcc5" (UID: "ab0a0781-bd35-4809-a999-fb592655bcc5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.325737 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b5d456dc-grn5w" event={"ID":"e1c67043-5e23-4c7a-92b8-b7d1513f1392","Type":"ContainerStarted","Data":"5f56621ec9ca70e508d038ebbc4f36ed017150b6863093f1666919a9f4a9b119"} Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.325788 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b5d456dc-grn5w" event={"ID":"e1c67043-5e23-4c7a-92b8-b7d1513f1392","Type":"ContainerStarted","Data":"e8aae1b3d44cc18b5ddf5bc8a7139e00969aa028aceeb1f8813e5a05010367c9"} Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.344025 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.347253 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"810e3b5f-3968-469f-8a31-a5426587d78a","Type":"ContainerDied","Data":"65fe6af4e59ca8c10befa9704b6a8325168876b53e115909dda8ed3711b9ca53"} Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.347315 4727 scope.go:117] "RemoveContainer" containerID="ce87f31ed80d5006851eee2237f3b16ab754bbff8c4d489a051b218708b6c875" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.362546 4727 generic.go:334] "Generic (PLEG): container finished" podID="ab0a0781-bd35-4809-a999-fb592655bcc5" containerID="299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7" exitCode=0 Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.362822 4727 generic.go:334] "Generic (PLEG): container finished" podID="ab0a0781-bd35-4809-a999-fb592655bcc5" containerID="3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4" exitCode=143 Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.362769 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.362795 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ab0a0781-bd35-4809-a999-fb592655bcc5","Type":"ContainerDied","Data":"299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7"} Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.363363 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ab0a0781-bd35-4809-a999-fb592655bcc5","Type":"ContainerDied","Data":"3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4"} Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.363432 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ab0a0781-bd35-4809-a999-fb592655bcc5","Type":"ContainerDied","Data":"5df089435d1aeb50afc84a8c030a4e2349ad3eb08e8c3f737cfc9e7c641ac1e7"} Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.363773 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab0a0781-bd35-4809-a999-fb592655bcc5-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.364144 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm9tf\" (UniqueName: \"kubernetes.io/projected/ab0a0781-bd35-4809-a999-fb592655bcc5-kube-api-access-dm9tf\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.364193 4727 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.364209 4727 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab0a0781-bd35-4809-a999-fb592655bcc5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.364223 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.364238 4727 reconciler_common.go:293] "Volume 
detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.462464 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-648455799b-c8jzs" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.462505 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.462522 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.462538 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:06 crc kubenswrapper[4727]: E1001 12:54:06.462924 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0a0781-bd35-4809-a999-fb592655bcc5" containerName="cinder-api-log" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.462940 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0a0781-bd35-4809-a999-fb592655bcc5" containerName="cinder-api-log" Oct 01 12:54:06 crc kubenswrapper[4727]: E1001 12:54:06.462956 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="ceilometer-notification-agent" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.462964 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="ceilometer-notification-agent" Oct 01 12:54:06 crc kubenswrapper[4727]: E1001 12:54:06.462973 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="sg-core" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.462979 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="sg-core" Oct 01 12:54:06 crc kubenswrapper[4727]: E1001 12:54:06.463025 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="ceilometer-central-agent" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.463032 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="ceilometer-central-agent" Oct 01 12:54:06 crc kubenswrapper[4727]: E1001 12:54:06.463042 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0a0781-bd35-4809-a999-fb592655bcc5" containerName="cinder-api" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.463048 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0a0781-bd35-4809-a999-fb592655bcc5" containerName="cinder-api" Oct 01 12:54:06 crc kubenswrapper[4727]: E1001 12:54:06.463061 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="proxy-httpd" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.463067 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="proxy-httpd" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.463218 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="sg-core" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.463232 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" 
containerName="ceilometer-notification-agent" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.463243 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="ceilometer-central-agent" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.463262 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0a0781-bd35-4809-a999-fb592655bcc5" containerName="cinder-api" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.463271 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0a0781-bd35-4809-a999-fb592655bcc5" containerName="cinder-api-log" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.463283 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" containerName="proxy-httpd" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.494837 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-config-data" (OuterVolumeSpecName: "config-data") pod "ab0a0781-bd35-4809-a999-fb592655bcc5" (UID: "ab0a0781-bd35-4809-a999-fb592655bcc5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.497693 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.502062 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-648455799b-c8jzs" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.519369 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.517260 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.517431 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.529963 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab0a0781-bd35-4809-a999-fb592655bcc5-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.536634 4727 scope.go:117] "RemoveContainer" containerID="345764f7582ecbaa73cc3dbdd8ce2b3a94cd865bc970655933456adb8026a488" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.625217 4727 scope.go:117] "RemoveContainer" containerID="f42b4535112d4770d0f0555d861bf11da4e6eb8063f6a8c64b028dfbbe8af7e0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.638508 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-config-data\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.638577 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.638600 4727 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-scripts\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.638664 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.638704 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjp8k\" (UniqueName: \"kubernetes.io/projected/b6793dc6-9887-423a-a856-e76d4cddbd83-kube-api-access-vjp8k\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.638729 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6793dc6-9887-423a-a856-e76d4cddbd83-run-httpd\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.638762 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6793dc6-9887-423a-a856-e76d4cddbd83-log-httpd\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.668801 4727 scope.go:117] "RemoveContainer" containerID="0ab3c8007acb4882f7977db1196ab7951d07b6eab131768f2137ebbdf531d39e" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.741099 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.741175 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjp8k\" (UniqueName: \"kubernetes.io/projected/b6793dc6-9887-423a-a856-e76d4cddbd83-kube-api-access-vjp8k\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.741213 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6793dc6-9887-423a-a856-e76d4cddbd83-run-httpd\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.741240 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6793dc6-9887-423a-a856-e76d4cddbd83-log-httpd\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.741418 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-config-data\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.741453 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.741477 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-scripts\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.745111 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6793dc6-9887-423a-a856-e76d4cddbd83-run-httpd\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.746462 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.747554 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6793dc6-9887-423a-a856-e76d4cddbd83-log-httpd\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.753010 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-scripts\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.754784 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-config-data\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.754876 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.761739 4727 scope.go:117] "RemoveContainer" containerID="299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.763507 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.784450 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjp8k\" (UniqueName: \"kubernetes.io/projected/b6793dc6-9887-423a-a856-e76d4cddbd83-kube-api-access-vjp8k\") pod 
\"ceilometer-0\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.801235 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.807487 4727 scope.go:117] "RemoveContainer" containerID="3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.814791 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.816882 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.820151 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.820643 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.821047 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.826839 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.874549 4727 scope.go:117] "RemoveContainer" containerID="299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7" Oct 01 12:54:06 crc kubenswrapper[4727]: E1001 12:54:06.879525 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7\": container with ID starting with 299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7 not found: ID does not exist" containerID="299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.879674 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7"} err="failed to get container status \"299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7\": rpc error: code = NotFound desc = could not find container \"299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7\": container with ID starting with 299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7 not found: ID does not exist" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.879717 4727 scope.go:117] "RemoveContainer" containerID="3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4" Oct 01 12:54:06 crc kubenswrapper[4727]: E1001 12:54:06.883261 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4\": container with ID starting with 3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4 not found: ID does not exist" containerID="3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.883320 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4"} err="failed to get container status 
\"3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4\": rpc error: code = NotFound desc = could not find container \"3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4\": container with ID starting with 3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4 not found: ID does not exist" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.883358 4727 scope.go:117] "RemoveContainer" containerID="299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.884578 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7"} err="failed to get container status \"299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7\": rpc error: code = NotFound desc = could not find container \"299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7\": container with ID starting with 299e84c82b6e7d9a17f56d4a95cdb1f7b6dad86e387f08e60398703f76b63fc7 not found: ID does not exist" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.884644 4727 scope.go:117] "RemoveContainer" containerID="3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.884912 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4"} err="failed to get container status \"3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4\": rpc error: code = NotFound desc = could not find container \"3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4\": container with ID starting with 3de682e51bde436d01b06b04ae16f153879284d191cbb8f0440e78f1f577cfb4 not found: ID does not exist" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.892123 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.947398 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.947589 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.947719 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-config-data\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.947745 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.947777 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-scripts\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.947834 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-logs\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.947917 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-config-data-custom\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.948136 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv6f4\" (UniqueName: \"kubernetes.io/projected/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-kube-api-access-tv6f4\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:06 crc kubenswrapper[4727]: I1001 12:54:06.948185 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.050405 4727 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-config-data\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.050878 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.050914 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-scripts\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.050937 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-logs\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.050980 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-config-data-custom\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.051052 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv6f4\" (UniqueName: \"kubernetes.io/projected/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-kube-api-access-tv6f4\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.051090 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.051179 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.051205 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.056601 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.056834 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-logs\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.058459 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-config-data\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.065925 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-config-data-custom\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.068676 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-scripts\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.076234 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.076775 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.083919 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv6f4\" (UniqueName: \"kubernetes.io/projected/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-kube-api-access-tv6f4\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.100892 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a88e6f-bc10-4121-a28a-e9f1ae533e6a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"73a88e6f-bc10-4121-a28a-e9f1ae533e6a\") " pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.184577 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.383810 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b5d456dc-grn5w" event={"ID":"e1c67043-5e23-4c7a-92b8-b7d1513f1392","Type":"ContainerStarted","Data":"fb6268ee789e3bab52bf596103b507f4615450a94ea802523e93595c49a7f46f"} Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.383959 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.434741 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-75b5d456dc-grn5w" podStartSLOduration=3.434716801 podStartE2EDuration="3.434716801s" podCreationTimestamp="2025-10-01 12:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:07.417506984 +0000 UTC m=+1025.738861821" watchObservedRunningTime="2025-10-01 12:54:07.434716801 +0000 UTC m=+1025.756071638" Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.477622 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:07 crc kubenswrapper[4727]: I1001 12:54:07.787963 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 12:54:08 crc kubenswrapper[4727]: I1001 12:54:08.392825 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="810e3b5f-3968-469f-8a31-a5426587d78a" path="/var/lib/kubelet/pods/810e3b5f-3968-469f-8a31-a5426587d78a/volumes" Oct 01 12:54:08 crc kubenswrapper[4727]: I1001 12:54:08.395364 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab0a0781-bd35-4809-a999-fb592655bcc5" path="/var/lib/kubelet/pods/ab0a0781-bd35-4809-a999-fb592655bcc5/volumes" Oct 01 12:54:08 crc kubenswrapper[4727]: I1001 12:54:08.426622 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73a88e6f-bc10-4121-a28a-e9f1ae533e6a","Type":"ContainerStarted","Data":"088e9beea486aab50b71e5be29f3f8029ba538e92163f6f9238c64c4316e5e1e"} Oct 01 12:54:08 crc kubenswrapper[4727]: I1001 12:54:08.433944 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6793dc6-9887-423a-a856-e76d4cddbd83","Type":"ContainerStarted","Data":"a867dea50456c905af7cca369f32a5f90a125576639706e1f9cf68035524a09c"} Oct 01 12:54:08 crc kubenswrapper[4727]: I1001 12:54:08.433982 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6793dc6-9887-423a-a856-e76d4cddbd83","Type":"ContainerStarted","Data":"439bf06e39eee40d1f3a7f86da991c1039c9b565a1243eae8c9df8c6c56b0c5e"} Oct 01 12:54:09 crc kubenswrapper[4727]: I1001 12:54:09.414768 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:54:09 crc kubenswrapper[4727]: I1001 12:54:09.418864 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7df68f6869-rwfcm" Oct 01 12:54:09 crc kubenswrapper[4727]: I1001 12:54:09.498935 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73a88e6f-bc10-4121-a28a-e9f1ae533e6a","Type":"ContainerStarted","Data":"3d99cae8f9cb243fc3adc8ca8460ffb81b940edc6b213859aac70a64f234117b"} Oct 01 12:54:10 crc kubenswrapper[4727]: I1001 12:54:10.389506 4727 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:54:10 crc kubenswrapper[4727]: I1001 12:54:10.450931 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-68dh7"] Oct 01 12:54:10 crc kubenswrapper[4727]: I1001 12:54:10.451189 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" podUID="17dd087d-9162-4d17-84fb-49bbcb2c542e" containerName="dnsmasq-dns" containerID="cri-o://0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05" gracePeriod=10 Oct 01 12:54:10 crc kubenswrapper[4727]: I1001 12:54:10.534171 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6793dc6-9887-423a-a856-e76d4cddbd83","Type":"ContainerStarted","Data":"f1819b525c6b6d9d122ce6fa459d6d9094ec7a8228e9b074c010d19fe5d89509"} Oct 01 12:54:10 crc kubenswrapper[4727]: I1001 12:54:10.534229 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6793dc6-9887-423a-a856-e76d4cddbd83","Type":"ContainerStarted","Data":"d0b1e46d2bbb802fdb455bc8022840f877301defd64578c7979ca7055dfa92c8"} Oct 01 12:54:10 crc kubenswrapper[4727]: I1001 12:54:10.545811 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73a88e6f-bc10-4121-a28a-e9f1ae533e6a","Type":"ContainerStarted","Data":"274a9c8bfb45d15afa8c9feabb327dd7664fb4af425acd86b6a0c964a52e2a01"} Oct 01 12:54:10 crc kubenswrapper[4727]: I1001 12:54:10.546056 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 01 12:54:10 crc kubenswrapper[4727]: I1001 12:54:10.603720 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.603703423 podStartE2EDuration="4.603703423s" podCreationTimestamp="2025-10-01 12:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:10.603428035 +0000 UTC m=+1028.924782882" watchObservedRunningTime="2025-10-01 12:54:10.603703423 +0000 UTC m=+1028.925058260" Oct 01 12:54:10 crc kubenswrapper[4727]: I1001 12:54:10.605709 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" podUID="17dd087d-9162-4d17-84fb-49bbcb2c542e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.151:5353: connect: connection refused" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.063137 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.126327 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.291821 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.356965 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g948\" (UniqueName: \"kubernetes.io/projected/17dd087d-9162-4d17-84fb-49bbcb2c542e-kube-api-access-4g948\") pod \"17dd087d-9162-4d17-84fb-49bbcb2c542e\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.357109 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-config\") pod \"17dd087d-9162-4d17-84fb-49bbcb2c542e\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.357212 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-ovsdbserver-nb\") pod \"17dd087d-9162-4d17-84fb-49bbcb2c542e\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.357252 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-ovsdbserver-sb\") pod \"17dd087d-9162-4d17-84fb-49bbcb2c542e\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.357339 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-dns-svc\") pod \"17dd087d-9162-4d17-84fb-49bbcb2c542e\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.357407 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-dns-swift-storage-0\") pod \"17dd087d-9162-4d17-84fb-49bbcb2c542e\" (UID: \"17dd087d-9162-4d17-84fb-49bbcb2c542e\") " Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.369497 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17dd087d-9162-4d17-84fb-49bbcb2c542e-kube-api-access-4g948" (OuterVolumeSpecName: "kube-api-access-4g948") pod "17dd087d-9162-4d17-84fb-49bbcb2c542e" (UID: "17dd087d-9162-4d17-84fb-49bbcb2c542e"). InnerVolumeSpecName "kube-api-access-4g948". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.405502 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.446857 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-config" (OuterVolumeSpecName: "config") pod "17dd087d-9162-4d17-84fb-49bbcb2c542e" (UID: "17dd087d-9162-4d17-84fb-49bbcb2c542e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.450316 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "17dd087d-9162-4d17-84fb-49bbcb2c542e" (UID: "17dd087d-9162-4d17-84fb-49bbcb2c542e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.460474 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g948\" (UniqueName: \"kubernetes.io/projected/17dd087d-9162-4d17-84fb-49bbcb2c542e-kube-api-access-4g948\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.461582 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.461710 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.480539 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "17dd087d-9162-4d17-84fb-49bbcb2c542e" (UID: "17dd087d-9162-4d17-84fb-49bbcb2c542e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.501701 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17dd087d-9162-4d17-84fb-49bbcb2c542e" (UID: "17dd087d-9162-4d17-84fb-49bbcb2c542e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.509882 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "17dd087d-9162-4d17-84fb-49bbcb2c542e" (UID: "17dd087d-9162-4d17-84fb-49bbcb2c542e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.563364 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.563399 4727 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.563413 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17dd087d-9162-4d17-84fb-49bbcb2c542e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.574194 4727 generic.go:334] "Generic (PLEG): container finished" podID="17dd087d-9162-4d17-84fb-49bbcb2c542e" containerID="0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05" exitCode=0 Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.574294 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" event={"ID":"17dd087d-9162-4d17-84fb-49bbcb2c542e","Type":"ContainerDied","Data":"0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05"} Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.574375 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" event={"ID":"17dd087d-9162-4d17-84fb-49bbcb2c542e","Type":"ContainerDied","Data":"1e72a2a03f34ebf7fc27501c99ee3e188fc1668af4a9adfe8d5ecf4f5c5510fc"} Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.574329 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-68dh7" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.574402 4727 scope.go:117] "RemoveContainer" containerID="0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.574426 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="edc05528-2111-4383-b904-8ad44aaa0a11" containerName="cinder-scheduler" containerID="cri-o://b524ea1a6d9190ce21bc32ec99b52aa00b9c006ce26c99420f2c6fa628f5d87c" gracePeriod=30 Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.574517 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="edc05528-2111-4383-b904-8ad44aaa0a11" containerName="probe" containerID="cri-o://308149e451fba753c6efce492fb50cbdee2025cd972b0225b81814bca3152252" gracePeriod=30 Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.620214 4727 scope.go:117] "RemoveContainer" containerID="654162fd7d0169b4404daeedafcbaf7ed4dc9a4e2f955bb2228cbf20cfd54fa9" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.640750 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-68dh7"] Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.661175 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-68dh7"] Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.668298 4727 scope.go:117] "RemoveContainer" containerID="0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05" Oct 01 12:54:11 crc kubenswrapper[4727]: E1001 12:54:11.673156 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05\": container with ID starting with 0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05 not found: ID does not exist" containerID="0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.673205 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05"} err="failed to get container status \"0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05\": rpc error: code = NotFound desc = could not find container \"0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05\": container with ID starting with 0dd2e0358359e27c18cb511307a5dcb144cee348103f4b04daa1e2e01f92af05 not found: ID does not exist" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.673236 4727 scope.go:117] "RemoveContainer" containerID="654162fd7d0169b4404daeedafcbaf7ed4dc9a4e2f955bb2228cbf20cfd54fa9" Oct 01 12:54:11 crc kubenswrapper[4727]: E1001 12:54:11.677166 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"654162fd7d0169b4404daeedafcbaf7ed4dc9a4e2f955bb2228cbf20cfd54fa9\": container with ID starting with 654162fd7d0169b4404daeedafcbaf7ed4dc9a4e2f955bb2228cbf20cfd54fa9 not found: ID does not exist" containerID="654162fd7d0169b4404daeedafcbaf7ed4dc9a4e2f955bb2228cbf20cfd54fa9" Oct 01 12:54:11 crc kubenswrapper[4727]: I1001 12:54:11.677336 4727 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"654162fd7d0169b4404daeedafcbaf7ed4dc9a4e2f955bb2228cbf20cfd54fa9"} err="failed to get container status \"654162fd7d0169b4404daeedafcbaf7ed4dc9a4e2f955bb2228cbf20cfd54fa9\": rpc error: code = NotFound desc = could not find container \"654162fd7d0169b4404daeedafcbaf7ed4dc9a4e2f955bb2228cbf20cfd54fa9\": container with ID starting with 654162fd7d0169b4404daeedafcbaf7ed4dc9a4e2f955bb2228cbf20cfd54fa9 not found: ID does not exist" Oct 01 12:54:12 crc kubenswrapper[4727]: I1001 12:54:12.384060 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17dd087d-9162-4d17-84fb-49bbcb2c542e" path="/var/lib/kubelet/pods/17dd087d-9162-4d17-84fb-49bbcb2c542e/volumes" Oct 01 12:54:12 crc kubenswrapper[4727]: I1001 12:54:12.588300 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6793dc6-9887-423a-a856-e76d4cddbd83","Type":"ContainerStarted","Data":"755ff3931776ceace522bee03b1bb99987da5e210f5c692892721116e3b6ca89"} Oct 01 12:54:12 crc kubenswrapper[4727]: I1001 12:54:12.588642 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:54:12 crc kubenswrapper[4727]: I1001 12:54:12.588669 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="ceilometer-central-agent" containerID="cri-o://a867dea50456c905af7cca369f32a5f90a125576639706e1f9cf68035524a09c" gracePeriod=30 Oct 01 12:54:12 crc kubenswrapper[4727]: I1001 12:54:12.588735 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="sg-core" containerID="cri-o://f1819b525c6b6d9d122ce6fa459d6d9094ec7a8228e9b074c010d19fe5d89509" gracePeriod=30 Oct 01 12:54:12 crc kubenswrapper[4727]: I1001 12:54:12.588771 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="proxy-httpd" containerID="cri-o://755ff3931776ceace522bee03b1bb99987da5e210f5c692892721116e3b6ca89" gracePeriod=30 Oct 01 12:54:12 crc kubenswrapper[4727]: I1001 12:54:12.588740 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="ceilometer-notification-agent" containerID="cri-o://d0b1e46d2bbb802fdb455bc8022840f877301defd64578c7979ca7055dfa92c8" gracePeriod=30 Oct 01 12:54:12 crc kubenswrapper[4727]: I1001 12:54:12.616358 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.106660943 podStartE2EDuration="6.61632732s" podCreationTimestamp="2025-10-01 12:54:06 +0000 UTC" firstStartedPulling="2025-10-01 12:54:07.491262307 +0000 UTC m=+1025.812617144" lastFinishedPulling="2025-10-01 12:54:12.000928674 +0000 UTC m=+1030.322283521" observedRunningTime="2025-10-01 12:54:12.61316215 +0000 UTC m=+1030.934516997" watchObservedRunningTime="2025-10-01 12:54:12.61632732 +0000 UTC m=+1030.937682157" Oct 01 12:54:13 crc kubenswrapper[4727]: I1001 12:54:13.603922 4727 generic.go:334] "Generic (PLEG): container finished" podID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerID="755ff3931776ceace522bee03b1bb99987da5e210f5c692892721116e3b6ca89" exitCode=0 Oct 01 12:54:13 crc kubenswrapper[4727]: I1001 12:54:13.604305 4727 generic.go:334] "Generic (PLEG): container 
finished" podID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerID="f1819b525c6b6d9d122ce6fa459d6d9094ec7a8228e9b074c010d19fe5d89509" exitCode=2 Oct 01 12:54:13 crc kubenswrapper[4727]: I1001 12:54:13.604317 4727 generic.go:334] "Generic (PLEG): container finished" podID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerID="d0b1e46d2bbb802fdb455bc8022840f877301defd64578c7979ca7055dfa92c8" exitCode=0 Oct 01 12:54:13 crc kubenswrapper[4727]: I1001 12:54:13.604033 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6793dc6-9887-423a-a856-e76d4cddbd83","Type":"ContainerDied","Data":"755ff3931776ceace522bee03b1bb99987da5e210f5c692892721116e3b6ca89"} Oct 01 12:54:13 crc kubenswrapper[4727]: I1001 12:54:13.604414 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6793dc6-9887-423a-a856-e76d4cddbd83","Type":"ContainerDied","Data":"f1819b525c6b6d9d122ce6fa459d6d9094ec7a8228e9b074c010d19fe5d89509"} Oct 01 12:54:13 crc kubenswrapper[4727]: I1001 12:54:13.604429 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6793dc6-9887-423a-a856-e76d4cddbd83","Type":"ContainerDied","Data":"d0b1e46d2bbb802fdb455bc8022840f877301defd64578c7979ca7055dfa92c8"} Oct 01 12:54:13 crc kubenswrapper[4727]: I1001 12:54:13.618785 4727 generic.go:334] "Generic (PLEG): container finished" podID="edc05528-2111-4383-b904-8ad44aaa0a11" containerID="308149e451fba753c6efce492fb50cbdee2025cd972b0225b81814bca3152252" exitCode=0 Oct 01 12:54:13 crc kubenswrapper[4727]: I1001 12:54:13.618880 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"edc05528-2111-4383-b904-8ad44aaa0a11","Type":"ContainerDied","Data":"308149e451fba753c6efce492fb50cbdee2025cd972b0225b81814bca3152252"} Oct 01 12:54:14 crc kubenswrapper[4727]: I1001 12:54:14.635509 4727 generic.go:334] "Generic (PLEG): container finished" podID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerID="a867dea50456c905af7cca369f32a5f90a125576639706e1f9cf68035524a09c" exitCode=0 Oct 01 12:54:14 crc kubenswrapper[4727]: I1001 12:54:14.635633 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6793dc6-9887-423a-a856-e76d4cddbd83","Type":"ContainerDied","Data":"a867dea50456c905af7cca369f32a5f90a125576639706e1f9cf68035524a09c"} Oct 01 12:54:14 crc kubenswrapper[4727]: I1001 12:54:14.639491 4727 generic.go:334] "Generic (PLEG): container finished" podID="edc05528-2111-4383-b904-8ad44aaa0a11" containerID="b524ea1a6d9190ce21bc32ec99b52aa00b9c006ce26c99420f2c6fa628f5d87c" exitCode=0 Oct 01 12:54:14 crc kubenswrapper[4727]: I1001 12:54:14.639540 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"edc05528-2111-4383-b904-8ad44aaa0a11","Type":"ContainerDied","Data":"b524ea1a6d9190ce21bc32ec99b52aa00b9c006ce26c99420f2c6fa628f5d87c"} Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.271294 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-nn6ql"] Oct 01 12:54:19 crc kubenswrapper[4727]: E1001 12:54:19.272460 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17dd087d-9162-4d17-84fb-49bbcb2c542e" containerName="init" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.272478 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="17dd087d-9162-4d17-84fb-49bbcb2c542e" containerName="init" Oct 01 12:54:19 crc kubenswrapper[4727]: E1001 12:54:19.272518 
4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17dd087d-9162-4d17-84fb-49bbcb2c542e" containerName="dnsmasq-dns" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.272526 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="17dd087d-9162-4d17-84fb-49bbcb2c542e" containerName="dnsmasq-dns" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.272782 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="17dd087d-9162-4d17-84fb-49bbcb2c542e" containerName="dnsmasq-dns" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.273552 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nn6ql" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.293274 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-nn6ql"] Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.320208 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlx2v\" (UniqueName: \"kubernetes.io/projected/f1c4e0f3-9f27-4304-a848-6b3482161126-kube-api-access-tlx2v\") pod \"nova-api-db-create-nn6ql\" (UID: \"f1c4e0f3-9f27-4304-a848-6b3482161126\") " pod="openstack/nova-api-db-create-nn6ql" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.430015 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlx2v\" (UniqueName: \"kubernetes.io/projected/f1c4e0f3-9f27-4304-a848-6b3482161126-kube-api-access-tlx2v\") pod \"nova-api-db-create-nn6ql\" (UID: \"f1c4e0f3-9f27-4304-a848-6b3482161126\") " pod="openstack/nova-api-db-create-nn6ql" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.439057 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-4jc2p"] Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.449383 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4jc2p" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.465317 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4jc2p"] Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.505412 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlx2v\" (UniqueName: \"kubernetes.io/projected/f1c4e0f3-9f27-4304-a848-6b3482161126-kube-api-access-tlx2v\") pod \"nova-api-db-create-nn6ql\" (UID: \"f1c4e0f3-9f27-4304-a848-6b3482161126\") " pod="openstack/nova-api-db-create-nn6ql" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.531414 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpntb\" (UniqueName: \"kubernetes.io/projected/586aeab7-2b38-400f-827b-6ea16b3bf9c4-kube-api-access-dpntb\") pod \"nova-cell0-db-create-4jc2p\" (UID: \"586aeab7-2b38-400f-827b-6ea16b3bf9c4\") " pod="openstack/nova-cell0-db-create-4jc2p" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.564334 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-7zj4h"] Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.568267 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7zj4h" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.580850 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7zj4h"] Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.596697 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nn6ql" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.633223 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwbsr\" (UniqueName: \"kubernetes.io/projected/8df368e0-d59d-40bd-8aea-d68ab67ba406-kube-api-access-xwbsr\") pod \"nova-cell1-db-create-7zj4h\" (UID: \"8df368e0-d59d-40bd-8aea-d68ab67ba406\") " pod="openstack/nova-cell1-db-create-7zj4h" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.633424 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpntb\" (UniqueName: \"kubernetes.io/projected/586aeab7-2b38-400f-827b-6ea16b3bf9c4-kube-api-access-dpntb\") pod \"nova-cell0-db-create-4jc2p\" (UID: \"586aeab7-2b38-400f-827b-6ea16b3bf9c4\") " pod="openstack/nova-cell0-db-create-4jc2p" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.662730 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpntb\" (UniqueName: \"kubernetes.io/projected/586aeab7-2b38-400f-827b-6ea16b3bf9c4-kube-api-access-dpntb\") pod \"nova-cell0-db-create-4jc2p\" (UID: \"586aeab7-2b38-400f-827b-6ea16b3bf9c4\") " pod="openstack/nova-cell0-db-create-4jc2p" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.737269 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbsr\" (UniqueName: \"kubernetes.io/projected/8df368e0-d59d-40bd-8aea-d68ab67ba406-kube-api-access-xwbsr\") pod \"nova-cell1-db-create-7zj4h\" (UID: \"8df368e0-d59d-40bd-8aea-d68ab67ba406\") " pod="openstack/nova-cell1-db-create-7zj4h" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.765784 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwbsr\" (UniqueName: \"kubernetes.io/projected/8df368e0-d59d-40bd-8aea-d68ab67ba406-kube-api-access-xwbsr\") pod \"nova-cell1-db-create-7zj4h\" (UID: \"8df368e0-d59d-40bd-8aea-d68ab67ba406\") " pod="openstack/nova-cell1-db-create-7zj4h" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.810657 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.857668 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4jc2p" Oct 01 12:54:19 crc kubenswrapper[4727]: I1001 12:54:19.886183 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7zj4h" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.150228 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.247935 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjp8k\" (UniqueName: \"kubernetes.io/projected/b6793dc6-9887-423a-a856-e76d4cddbd83-kube-api-access-vjp8k\") pod \"b6793dc6-9887-423a-a856-e76d4cddbd83\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.248113 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6793dc6-9887-423a-a856-e76d4cddbd83-run-httpd\") pod \"b6793dc6-9887-423a-a856-e76d4cddbd83\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.248191 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-config-data\") pod \"b6793dc6-9887-423a-a856-e76d4cddbd83\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.248220 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-combined-ca-bundle\") pod \"b6793dc6-9887-423a-a856-e76d4cddbd83\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.248296 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-sg-core-conf-yaml\") pod \"b6793dc6-9887-423a-a856-e76d4cddbd83\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.248425 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-scripts\") pod \"b6793dc6-9887-423a-a856-e76d4cddbd83\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.248495 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6793dc6-9887-423a-a856-e76d4cddbd83-log-httpd\") pod \"b6793dc6-9887-423a-a856-e76d4cddbd83\" (UID: \"b6793dc6-9887-423a-a856-e76d4cddbd83\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.249786 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6793dc6-9887-423a-a856-e76d4cddbd83-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b6793dc6-9887-423a-a856-e76d4cddbd83" (UID: "b6793dc6-9887-423a-a856-e76d4cddbd83"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.250990 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6793dc6-9887-423a-a856-e76d4cddbd83-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b6793dc6-9887-423a-a856-e76d4cddbd83" (UID: "b6793dc6-9887-423a-a856-e76d4cddbd83"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.256032 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-scripts" (OuterVolumeSpecName: "scripts") pod "b6793dc6-9887-423a-a856-e76d4cddbd83" (UID: "b6793dc6-9887-423a-a856-e76d4cddbd83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.260181 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6793dc6-9887-423a-a856-e76d4cddbd83-kube-api-access-vjp8k" (OuterVolumeSpecName: "kube-api-access-vjp8k") pod "b6793dc6-9887-423a-a856-e76d4cddbd83" (UID: "b6793dc6-9887-423a-a856-e76d4cddbd83"). InnerVolumeSpecName "kube-api-access-vjp8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.304739 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b6793dc6-9887-423a-a856-e76d4cddbd83" (UID: "b6793dc6-9887-423a-a856-e76d4cddbd83"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.315561 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.349941 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-scripts\") pod \"edc05528-2111-4383-b904-8ad44aaa0a11\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.350012 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/edc05528-2111-4383-b904-8ad44aaa0a11-etc-machine-id\") pod \"edc05528-2111-4383-b904-8ad44aaa0a11\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.350087 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-combined-ca-bundle\") pod \"edc05528-2111-4383-b904-8ad44aaa0a11\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.350153 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hrlk\" (UniqueName: \"kubernetes.io/projected/edc05528-2111-4383-b904-8ad44aaa0a11-kube-api-access-7hrlk\") pod \"edc05528-2111-4383-b904-8ad44aaa0a11\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.350270 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edc05528-2111-4383-b904-8ad44aaa0a11-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "edc05528-2111-4383-b904-8ad44aaa0a11" (UID: "edc05528-2111-4383-b904-8ad44aaa0a11"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.350864 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-config-data-custom\") pod \"edc05528-2111-4383-b904-8ad44aaa0a11\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.350922 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-config-data\") pod \"edc05528-2111-4383-b904-8ad44aaa0a11\" (UID: \"edc05528-2111-4383-b904-8ad44aaa0a11\") " Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.359938 4727 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/edc05528-2111-4383-b904-8ad44aaa0a11-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.360030 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjp8k\" (UniqueName: \"kubernetes.io/projected/b6793dc6-9887-423a-a856-e76d4cddbd83-kube-api-access-vjp8k\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.360047 4727 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6793dc6-9887-423a-a856-e76d4cddbd83-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.360058 4727 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.360073 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.360084 4727 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6793dc6-9887-423a-a856-e76d4cddbd83-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.359954 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc05528-2111-4383-b904-8ad44aaa0a11-kube-api-access-7hrlk" (OuterVolumeSpecName: "kube-api-access-7hrlk") pod "edc05528-2111-4383-b904-8ad44aaa0a11" (UID: "edc05528-2111-4383-b904-8ad44aaa0a11"). InnerVolumeSpecName "kube-api-access-7hrlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.365088 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-scripts" (OuterVolumeSpecName: "scripts") pod "edc05528-2111-4383-b904-8ad44aaa0a11" (UID: "edc05528-2111-4383-b904-8ad44aaa0a11"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.365251 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "edc05528-2111-4383-b904-8ad44aaa0a11" (UID: "edc05528-2111-4383-b904-8ad44aaa0a11"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.407121 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6793dc6-9887-423a-a856-e76d4cddbd83" (UID: "b6793dc6-9887-423a-a856-e76d4cddbd83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.424800 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-config-data" (OuterVolumeSpecName: "config-data") pod "b6793dc6-9887-423a-a856-e76d4cddbd83" (UID: "b6793dc6-9887-423a-a856-e76d4cddbd83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.428896 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edc05528-2111-4383-b904-8ad44aaa0a11" (UID: "edc05528-2111-4383-b904-8ad44aaa0a11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.447861 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-nn6ql"] Oct 01 12:54:20 crc kubenswrapper[4727]: W1001 12:54:20.460266 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1c4e0f3_9f27_4304_a848_6b3482161126.slice/crio-6f5a20ec850dc2c74ed29f50da84d3d1e3a92c6acddf45ae1273c5af152d606e WatchSource:0}: Error finding container 6f5a20ec850dc2c74ed29f50da84d3d1e3a92c6acddf45ae1273c5af152d606e: Status 404 returned error can't find the container with id 6f5a20ec850dc2c74ed29f50da84d3d1e3a92c6acddf45ae1273c5af152d606e Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.461756 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.461784 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.461809 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hrlk\" (UniqueName: \"kubernetes.io/projected/edc05528-2111-4383-b904-8ad44aaa0a11-kube-api-access-7hrlk\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.461822 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.461835 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6793dc6-9887-423a-a856-e76d4cddbd83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.461847 4727 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.607516 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-4jc2p"] Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.624196 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-config-data" (OuterVolumeSpecName: "config-data") pod "edc05528-2111-4383-b904-8ad44aaa0a11" (UID: "edc05528-2111-4383-b904-8ad44aaa0a11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.665433 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edc05528-2111-4383-b904-8ad44aaa0a11-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.719206 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7zj4h"] Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.721407 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6793dc6-9887-423a-a856-e76d4cddbd83","Type":"ContainerDied","Data":"439bf06e39eee40d1f3a7f86da991c1039c9b565a1243eae8c9df8c6c56b0c5e"} Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.721470 4727 scope.go:117] "RemoveContainer" containerID="755ff3931776ceace522bee03b1bb99987da5e210f5c692892721116e3b6ca89" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.721657 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:20 crc kubenswrapper[4727]: W1001 12:54:20.730480 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8df368e0_d59d_40bd_8aea_d68ab67ba406.slice/crio-17314e1c4cf4780e0cb9ba6119f8b610cdaa568be669488d3786c922dfc435a7 WatchSource:0}: Error finding container 17314e1c4cf4780e0cb9ba6119f8b610cdaa568be669488d3786c922dfc435a7: Status 404 returned error can't find the container with id 17314e1c4cf4780e0cb9ba6119f8b610cdaa568be669488d3786c922dfc435a7 Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.732336 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"edc05528-2111-4383-b904-8ad44aaa0a11","Type":"ContainerDied","Data":"fe7edf6f8484d18cf390652800d7cd5a44c24eb4300debb39bc9969082b4b24c"} Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.732383 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.734944 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fc493472-2f4d-4d92-9ba3-22850bd45ae6","Type":"ContainerStarted","Data":"55618a94a9e20a452f593c8f5b8d568c7b3724d7f5f8403144252acf97bf64c7"} Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.740211 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4jc2p" event={"ID":"586aeab7-2b38-400f-827b-6ea16b3bf9c4","Type":"ContainerStarted","Data":"3302da2c5312af880008c8b660a4521b4934e0b12ae7aa4d4eb15341b85fbcbe"} Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.746561 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nn6ql" event={"ID":"f1c4e0f3-9f27-4304-a848-6b3482161126","Type":"ContainerStarted","Data":"6f5a20ec850dc2c74ed29f50da84d3d1e3a92c6acddf45ae1273c5af152d606e"} Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.788800 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.550838032 podStartE2EDuration="23.788773744s" podCreationTimestamp="2025-10-01 12:53:57 +0000 UTC" firstStartedPulling="2025-10-01 12:53:58.541661084 +0000 UTC m=+1016.863015931" lastFinishedPulling="2025-10-01 12:54:19.779596796 +0000 UTC m=+1038.100951643" observedRunningTime="2025-10-01 12:54:20.777231413 +0000 UTC m=+1039.098586250" watchObservedRunningTime="2025-10-01 12:54:20.788773744 +0000 UTC m=+1039.110128611" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.812761 4727 scope.go:117] "RemoveContainer" containerID="f1819b525c6b6d9d122ce6fa459d6d9094ec7a8228e9b074c010d19fe5d89509" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.886516 4727 scope.go:117] "RemoveContainer" containerID="d0b1e46d2bbb802fdb455bc8022840f877301defd64578c7979ca7055dfa92c8" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.904644 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.915758 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.935750 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936207 4727 scope.go:117] "RemoveContainer" containerID="a867dea50456c905af7cca369f32a5f90a125576639706e1f9cf68035524a09c" Oct 01 12:54:20 crc kubenswrapper[4727]: E1001 12:54:20.936264 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="ceilometer-notification-agent" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936284 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="ceilometer-notification-agent" Oct 01 12:54:20 crc kubenswrapper[4727]: E1001 12:54:20.936302 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc05528-2111-4383-b904-8ad44aaa0a11" containerName="probe" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936312 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc05528-2111-4383-b904-8ad44aaa0a11" containerName="probe" Oct 01 12:54:20 crc kubenswrapper[4727]: E1001 12:54:20.936326 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" 
containerName="sg-core" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936334 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="sg-core" Oct 01 12:54:20 crc kubenswrapper[4727]: E1001 12:54:20.936348 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc05528-2111-4383-b904-8ad44aaa0a11" containerName="cinder-scheduler" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936357 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc05528-2111-4383-b904-8ad44aaa0a11" containerName="cinder-scheduler" Oct 01 12:54:20 crc kubenswrapper[4727]: E1001 12:54:20.936379 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="proxy-httpd" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936387 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="proxy-httpd" Oct 01 12:54:20 crc kubenswrapper[4727]: E1001 12:54:20.936411 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="ceilometer-central-agent" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936419 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="ceilometer-central-agent" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936646 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="proxy-httpd" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936671 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="ceilometer-central-agent" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936686 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc05528-2111-4383-b904-8ad44aaa0a11" containerName="cinder-scheduler" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936698 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="ceilometer-notification-agent" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936723 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" containerName="sg-core" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.936738 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc05528-2111-4383-b904-8ad44aaa0a11" containerName="probe" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.938839 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.942496 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.943489 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.956793 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.977140 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.976275 4727 scope.go:117] "RemoveContainer" containerID="308149e451fba753c6efce492fb50cbdee2025cd972b0225b81814bca3152252" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.981676 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-config-data\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.981787 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.981848 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcjx2\" (UniqueName: \"kubernetes.io/projected/bde180d2-4139-4068-8702-4a8b8b21ffe8-kube-api-access-lcjx2\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.981869 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bde180d2-4139-4068-8702-4a8b8b21ffe8-log-httpd\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.981930 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bde180d2-4139-4068-8702-4a8b8b21ffe8-run-httpd\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.982045 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-scripts\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.982115 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:20 crc kubenswrapper[4727]: I1001 12:54:20.990069 4727 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.008921 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.012246 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.020264 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.055018 4727 scope.go:117] "RemoveContainer" containerID="b524ea1a6d9190ce21bc32ec99b52aa00b9c006ce26c99420f2c6fa628f5d87c" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.078901 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085392 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bde180d2-4139-4068-8702-4a8b8b21ffe8-run-httpd\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085481 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b26dc55a-af6d-4797-a736-e1ad576aef99-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085505 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b26dc55a-af6d-4797-a736-e1ad576aef99-scripts\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085581 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-scripts\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085605 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz7jb\" (UniqueName: \"kubernetes.io/projected/b26dc55a-af6d-4797-a736-e1ad576aef99-kube-api-access-zz7jb\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085668 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085702 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-config-data\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085725 4727 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26dc55a-af6d-4797-a736-e1ad576aef99-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085758 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085781 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26dc55a-af6d-4797-a736-e1ad576aef99-config-data\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085825 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bde180d2-4139-4068-8702-4a8b8b21ffe8-log-httpd\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085847 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcjx2\" (UniqueName: \"kubernetes.io/projected/bde180d2-4139-4068-8702-4a8b8b21ffe8-kube-api-access-lcjx2\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.085872 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26dc55a-af6d-4797-a736-e1ad576aef99-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.086918 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bde180d2-4139-4068-8702-4a8b8b21ffe8-run-httpd\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.086944 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bde180d2-4139-4068-8702-4a8b8b21ffe8-log-httpd\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.099016 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.101835 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-scripts\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 
crc kubenswrapper[4727]: I1001 12:54:21.102338 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.103509 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-config-data\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.114776 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcjx2\" (UniqueName: \"kubernetes.io/projected/bde180d2-4139-4068-8702-4a8b8b21ffe8-kube-api-access-lcjx2\") pod \"ceilometer-0\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.187693 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b26dc55a-af6d-4797-a736-e1ad576aef99-scripts\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.187814 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz7jb\" (UniqueName: \"kubernetes.io/projected/b26dc55a-af6d-4797-a736-e1ad576aef99-kube-api-access-zz7jb\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.187899 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26dc55a-af6d-4797-a736-e1ad576aef99-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.187935 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26dc55a-af6d-4797-a736-e1ad576aef99-config-data\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.187980 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26dc55a-af6d-4797-a736-e1ad576aef99-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.188056 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b26dc55a-af6d-4797-a736-e1ad576aef99-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.188386 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b26dc55a-af6d-4797-a736-e1ad576aef99-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.193731 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26dc55a-af6d-4797-a736-e1ad576aef99-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.193844 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26dc55a-af6d-4797-a736-e1ad576aef99-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.196315 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b26dc55a-af6d-4797-a736-e1ad576aef99-scripts\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.198468 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26dc55a-af6d-4797-a736-e1ad576aef99-config-data\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.211489 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz7jb\" (UniqueName: \"kubernetes.io/projected/b26dc55a-af6d-4797-a736-e1ad576aef99-kube-api-access-zz7jb\") pod \"cinder-scheduler-0\" (UID: \"b26dc55a-af6d-4797-a736-e1ad576aef99\") " pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.275424 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.342150 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.743126 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.764163 4727 generic.go:334] "Generic (PLEG): container finished" podID="8df368e0-d59d-40bd-8aea-d68ab67ba406" containerID="7290d19e504390303179c1a68cd83f65b5f2b38c68624f601ddb3c777e4639d1" exitCode=0 Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.764247 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7zj4h" event={"ID":"8df368e0-d59d-40bd-8aea-d68ab67ba406","Type":"ContainerDied","Data":"7290d19e504390303179c1a68cd83f65b5f2b38c68624f601ddb3c777e4639d1"} Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.764279 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7zj4h" event={"ID":"8df368e0-d59d-40bd-8aea-d68ab67ba406","Type":"ContainerStarted","Data":"17314e1c4cf4780e0cb9ba6119f8b610cdaa568be669488d3786c922dfc435a7"} Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.766720 4727 generic.go:334] "Generic (PLEG): container finished" podID="586aeab7-2b38-400f-827b-6ea16b3bf9c4" containerID="b873a6cc63eb4f0f842a93ec4f4daf58ccf4dd4beb19ee861959f46446b44f5d" exitCode=0 Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.766775 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4jc2p" event={"ID":"586aeab7-2b38-400f-827b-6ea16b3bf9c4","Type":"ContainerDied","Data":"b873a6cc63eb4f0f842a93ec4f4daf58ccf4dd4beb19ee861959f46446b44f5d"} Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.768665 4727 generic.go:334] "Generic (PLEG): container finished" podID="f1c4e0f3-9f27-4304-a848-6b3482161126" containerID="d8ae7785c1f1ddeaba7fe147dec8eae40ca763c569b1d9fdc919bcde0ab4aed9" exitCode=0 Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.768743 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nn6ql" event={"ID":"f1c4e0f3-9f27-4304-a848-6b3482161126","Type":"ContainerDied","Data":"d8ae7785c1f1ddeaba7fe147dec8eae40ca763c569b1d9fdc919bcde0ab4aed9"} Oct 01 12:54:21 crc kubenswrapper[4727]: I1001 12:54:21.885530 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 12:54:22 crc kubenswrapper[4727]: I1001 12:54:22.394363 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6793dc6-9887-423a-a856-e76d4cddbd83" path="/var/lib/kubelet/pods/b6793dc6-9887-423a-a856-e76d4cddbd83/volumes" Oct 01 12:54:22 crc kubenswrapper[4727]: I1001 12:54:22.395319 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc05528-2111-4383-b904-8ad44aaa0a11" path="/var/lib/kubelet/pods/edc05528-2111-4383-b904-8ad44aaa0a11/volumes" Oct 01 12:54:22 crc kubenswrapper[4727]: I1001 12:54:22.783067 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bde180d2-4139-4068-8702-4a8b8b21ffe8","Type":"ContainerStarted","Data":"e678b2bf3d7b30718e2f15597e3f46ed6770ec16701fa6e230da0dd2ddfbbc9e"} Oct 01 12:54:22 crc kubenswrapper[4727]: I1001 12:54:22.783422 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bde180d2-4139-4068-8702-4a8b8b21ffe8","Type":"ContainerStarted","Data":"58348455aae1130e255a8849d81db58cd5e39e931b443d72bfa0e67bf55129b7"} Oct 01 12:54:22 crc kubenswrapper[4727]: I1001 12:54:22.790873 4727 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b26dc55a-af6d-4797-a736-e1ad576aef99","Type":"ContainerStarted","Data":"1cbc8d10ae5024c1eaef985c1a1a3d08ceccb1cf49d4ed6b349dd85164d7b222"} Oct 01 12:54:22 crc kubenswrapper[4727]: I1001 12:54:22.790942 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b26dc55a-af6d-4797-a736-e1ad576aef99","Type":"ContainerStarted","Data":"594ca2cbf50acc82ce359859f3678576531253b0053c0ac91c8bdb82bc27a826"} Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.223748 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4jc2p" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.334071 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpntb\" (UniqueName: \"kubernetes.io/projected/586aeab7-2b38-400f-827b-6ea16b3bf9c4-kube-api-access-dpntb\") pod \"586aeab7-2b38-400f-827b-6ea16b3bf9c4\" (UID: \"586aeab7-2b38-400f-827b-6ea16b3bf9c4\") " Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.341383 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586aeab7-2b38-400f-827b-6ea16b3bf9c4-kube-api-access-dpntb" (OuterVolumeSpecName: "kube-api-access-dpntb") pod "586aeab7-2b38-400f-827b-6ea16b3bf9c4" (UID: "586aeab7-2b38-400f-827b-6ea16b3bf9c4"). InnerVolumeSpecName "kube-api-access-dpntb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.394214 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nn6ql" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.428240 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7zj4h" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.437330 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlx2v\" (UniqueName: \"kubernetes.io/projected/f1c4e0f3-9f27-4304-a848-6b3482161126-kube-api-access-tlx2v\") pod \"f1c4e0f3-9f27-4304-a848-6b3482161126\" (UID: \"f1c4e0f3-9f27-4304-a848-6b3482161126\") " Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.438438 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpntb\" (UniqueName: \"kubernetes.io/projected/586aeab7-2b38-400f-827b-6ea16b3bf9c4-kube-api-access-dpntb\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.451492 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1c4e0f3-9f27-4304-a848-6b3482161126-kube-api-access-tlx2v" (OuterVolumeSpecName: "kube-api-access-tlx2v") pod "f1c4e0f3-9f27-4304-a848-6b3482161126" (UID: "f1c4e0f3-9f27-4304-a848-6b3482161126"). InnerVolumeSpecName "kube-api-access-tlx2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.542706 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwbsr\" (UniqueName: \"kubernetes.io/projected/8df368e0-d59d-40bd-8aea-d68ab67ba406-kube-api-access-xwbsr\") pod \"8df368e0-d59d-40bd-8aea-d68ab67ba406\" (UID: \"8df368e0-d59d-40bd-8aea-d68ab67ba406\") " Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.543665 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlx2v\" (UniqueName: \"kubernetes.io/projected/f1c4e0f3-9f27-4304-a848-6b3482161126-kube-api-access-tlx2v\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.551203 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8df368e0-d59d-40bd-8aea-d68ab67ba406-kube-api-access-xwbsr" (OuterVolumeSpecName: "kube-api-access-xwbsr") pod "8df368e0-d59d-40bd-8aea-d68ab67ba406" (UID: "8df368e0-d59d-40bd-8aea-d68ab67ba406"). InnerVolumeSpecName "kube-api-access-xwbsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.645890 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwbsr\" (UniqueName: \"kubernetes.io/projected/8df368e0-d59d-40bd-8aea-d68ab67ba406-kube-api-access-xwbsr\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.819588 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-nn6ql" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.819632 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-nn6ql" event={"ID":"f1c4e0f3-9f27-4304-a848-6b3482161126","Type":"ContainerDied","Data":"6f5a20ec850dc2c74ed29f50da84d3d1e3a92c6acddf45ae1273c5af152d606e"} Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.819674 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f5a20ec850dc2c74ed29f50da84d3d1e3a92c6acddf45ae1273c5af152d606e" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.821609 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bde180d2-4139-4068-8702-4a8b8b21ffe8","Type":"ContainerStarted","Data":"1ece483570580c93e0d8c2adadcd9bcfbe15ac51ccf7461ae9794225d5d6bb27"} Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.823741 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b26dc55a-af6d-4797-a736-e1ad576aef99","Type":"ContainerStarted","Data":"7eb188634ab8165568e905d6fc7ec4de11c7bd3977552ea48877ad8ce9920f84"} Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.827592 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7zj4h" event={"ID":"8df368e0-d59d-40bd-8aea-d68ab67ba406","Type":"ContainerDied","Data":"17314e1c4cf4780e0cb9ba6119f8b610cdaa568be669488d3786c922dfc435a7"} Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.827630 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17314e1c4cf4780e0cb9ba6119f8b610cdaa568be669488d3786c922dfc435a7" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.827686 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7zj4h" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.857198 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.857179233 podStartE2EDuration="3.857179233s" podCreationTimestamp="2025-10-01 12:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:23.853542 +0000 UTC m=+1042.174896857" watchObservedRunningTime="2025-10-01 12:54:23.857179233 +0000 UTC m=+1042.178534070" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.859797 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-4jc2p" event={"ID":"586aeab7-2b38-400f-827b-6ea16b3bf9c4","Type":"ContainerDied","Data":"3302da2c5312af880008c8b660a4521b4934e0b12ae7aa4d4eb15341b85fbcbe"} Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.860018 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3302da2c5312af880008c8b660a4521b4934e0b12ae7aa4d4eb15341b85fbcbe" Oct 01 12:54:23 crc kubenswrapper[4727]: I1001 12:54:23.859906 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-4jc2p" Oct 01 12:54:24 crc kubenswrapper[4727]: I1001 12:54:24.121838 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:54:24 crc kubenswrapper[4727]: I1001 12:54:24.122413 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ff4391de-b5d6-4014-961f-a00f0a8ec3c6" containerName="glance-log" containerID="cri-o://558b23bc2226bc21814dd57165cf8033e45ef38cf572e228bb1d7acd22b6c525" gracePeriod=30 Oct 01 12:54:24 crc kubenswrapper[4727]: I1001 12:54:24.122838 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ff4391de-b5d6-4014-961f-a00f0a8ec3c6" containerName="glance-httpd" containerID="cri-o://61f027a0eafca855e2f7f638c146097f7d31adbceb27f575cd21de41b6ca23fa" gracePeriod=30 Oct 01 12:54:24 crc kubenswrapper[4727]: I1001 12:54:24.870632 4727 generic.go:334] "Generic (PLEG): container finished" podID="ff4391de-b5d6-4014-961f-a00f0a8ec3c6" containerID="558b23bc2226bc21814dd57165cf8033e45ef38cf572e228bb1d7acd22b6c525" exitCode=143 Oct 01 12:54:24 crc kubenswrapper[4727]: I1001 12:54:24.870684 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff4391de-b5d6-4014-961f-a00f0a8ec3c6","Type":"ContainerDied","Data":"558b23bc2226bc21814dd57165cf8033e45ef38cf572e228bb1d7acd22b6c525"} Oct 01 12:54:24 crc kubenswrapper[4727]: I1001 12:54:24.873949 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bde180d2-4139-4068-8702-4a8b8b21ffe8","Type":"ContainerStarted","Data":"bd35cc8efbc2230d464d3f10ab4faff3abba0b7afe69a0c521c6fe84b3618884"} Oct 01 12:54:25 crc kubenswrapper[4727]: I1001 12:54:25.247331 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:54:25 crc kubenswrapper[4727]: I1001 12:54:25.247615 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a6468c22-4086-4884-a045-9f77a1b459d6" containerName="glance-log" 
containerID="cri-o://b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1" gracePeriod=30 Oct 01 12:54:25 crc kubenswrapper[4727]: I1001 12:54:25.247695 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a6468c22-4086-4884-a045-9f77a1b459d6" containerName="glance-httpd" containerID="cri-o://d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc" gracePeriod=30 Oct 01 12:54:25 crc kubenswrapper[4727]: I1001 12:54:25.732736 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:25 crc kubenswrapper[4727]: I1001 12:54:25.885174 4727 generic.go:334] "Generic (PLEG): container finished" podID="a6468c22-4086-4884-a045-9f77a1b459d6" containerID="b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1" exitCode=143 Oct 01 12:54:25 crc kubenswrapper[4727]: I1001 12:54:25.885226 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6468c22-4086-4884-a045-9f77a1b459d6","Type":"ContainerDied","Data":"b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1"} Oct 01 12:54:26 crc kubenswrapper[4727]: I1001 12:54:26.343089 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 12:54:26 crc kubenswrapper[4727]: I1001 12:54:26.903518 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bde180d2-4139-4068-8702-4a8b8b21ffe8","Type":"ContainerStarted","Data":"3543f20dcaa6f4ac46073569dbcdbc26ca73bb58dffd3141ffd6ed4aaf0640d1"} Oct 01 12:54:26 crc kubenswrapper[4727]: I1001 12:54:26.904468 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:54:26 crc kubenswrapper[4727]: I1001 12:54:26.903805 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="proxy-httpd" containerID="cri-o://3543f20dcaa6f4ac46073569dbcdbc26ca73bb58dffd3141ffd6ed4aaf0640d1" gracePeriod=30 Oct 01 12:54:26 crc kubenswrapper[4727]: I1001 12:54:26.903959 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="sg-core" containerID="cri-o://bd35cc8efbc2230d464d3f10ab4faff3abba0b7afe69a0c521c6fe84b3618884" gracePeriod=30 Oct 01 12:54:26 crc kubenswrapper[4727]: I1001 12:54:26.904069 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="ceilometer-notification-agent" containerID="cri-o://1ece483570580c93e0d8c2adadcd9bcfbe15ac51ccf7461ae9794225d5d6bb27" gracePeriod=30 Oct 01 12:54:26 crc kubenswrapper[4727]: I1001 12:54:26.903672 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="ceilometer-central-agent" containerID="cri-o://e678b2bf3d7b30718e2f15597e3f46ed6770ec16701fa6e230da0dd2ddfbbc9e" gracePeriod=30 Oct 01 12:54:26 crc kubenswrapper[4727]: I1001 12:54:26.958352 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.748609171 podStartE2EDuration="6.958324516s" podCreationTimestamp="2025-10-01 12:54:20 +0000 UTC" firstStartedPulling="2025-10-01 12:54:21.751441759 
+0000 UTC m=+1040.072796596" lastFinishedPulling="2025-10-01 12:54:25.961157094 +0000 UTC m=+1044.282511941" observedRunningTime="2025-10-01 12:54:26.953427883 +0000 UTC m=+1045.274782740" watchObservedRunningTime="2025-10-01 12:54:26.958324516 +0000 UTC m=+1045.279679363" Oct 01 12:54:27 crc kubenswrapper[4727]: I1001 12:54:27.920530 4727 generic.go:334] "Generic (PLEG): container finished" podID="ff4391de-b5d6-4014-961f-a00f0a8ec3c6" containerID="61f027a0eafca855e2f7f638c146097f7d31adbceb27f575cd21de41b6ca23fa" exitCode=0 Oct 01 12:54:27 crc kubenswrapper[4727]: I1001 12:54:27.920797 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff4391de-b5d6-4014-961f-a00f0a8ec3c6","Type":"ContainerDied","Data":"61f027a0eafca855e2f7f638c146097f7d31adbceb27f575cd21de41b6ca23fa"} Oct 01 12:54:27 crc kubenswrapper[4727]: I1001 12:54:27.928009 4727 generic.go:334] "Generic (PLEG): container finished" podID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerID="3543f20dcaa6f4ac46073569dbcdbc26ca73bb58dffd3141ffd6ed4aaf0640d1" exitCode=0 Oct 01 12:54:27 crc kubenswrapper[4727]: I1001 12:54:27.928032 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bde180d2-4139-4068-8702-4a8b8b21ffe8","Type":"ContainerDied","Data":"3543f20dcaa6f4ac46073569dbcdbc26ca73bb58dffd3141ffd6ed4aaf0640d1"} Oct 01 12:54:27 crc kubenswrapper[4727]: I1001 12:54:27.928087 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bde180d2-4139-4068-8702-4a8b8b21ffe8","Type":"ContainerDied","Data":"bd35cc8efbc2230d464d3f10ab4faff3abba0b7afe69a0c521c6fe84b3618884"} Oct 01 12:54:27 crc kubenswrapper[4727]: I1001 12:54:27.928050 4727 generic.go:334] "Generic (PLEG): container finished" podID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerID="bd35cc8efbc2230d464d3f10ab4faff3abba0b7afe69a0c521c6fe84b3618884" exitCode=2 Oct 01 12:54:27 crc kubenswrapper[4727]: I1001 12:54:27.928119 4727 generic.go:334] "Generic (PLEG): container finished" podID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerID="1ece483570580c93e0d8c2adadcd9bcfbe15ac51ccf7461ae9794225d5d6bb27" exitCode=0 Oct 01 12:54:27 crc kubenswrapper[4727]: I1001 12:54:27.928149 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bde180d2-4139-4068-8702-4a8b8b21ffe8","Type":"ContainerDied","Data":"1ece483570580c93e0d8c2adadcd9bcfbe15ac51ccf7461ae9794225d5d6bb27"} Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.062829 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.142015 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-logs\") pod \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.142061 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.142094 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-scripts\") pod \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.142130 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-internal-tls-certs\") pod \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.142152 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwtbn\" (UniqueName: \"kubernetes.io/projected/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-kube-api-access-xwtbn\") pod \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.142177 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-config-data\") pod \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.142383 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-combined-ca-bundle\") pod \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.142711 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-logs" (OuterVolumeSpecName: "logs") pod "ff4391de-b5d6-4014-961f-a00f0a8ec3c6" (UID: "ff4391de-b5d6-4014-961f-a00f0a8ec3c6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.143037 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-httpd-run\") pod \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\" (UID: \"ff4391de-b5d6-4014-961f-a00f0a8ec3c6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.143039 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ff4391de-b5d6-4014-961f-a00f0a8ec3c6" (UID: "ff4391de-b5d6-4014-961f-a00f0a8ec3c6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.143478 4727 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.143490 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.154224 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-kube-api-access-xwtbn" (OuterVolumeSpecName: "kube-api-access-xwtbn") pod "ff4391de-b5d6-4014-961f-a00f0a8ec3c6" (UID: "ff4391de-b5d6-4014-961f-a00f0a8ec3c6"). InnerVolumeSpecName "kube-api-access-xwtbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.170474 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-scripts" (OuterVolumeSpecName: "scripts") pod "ff4391de-b5d6-4014-961f-a00f0a8ec3c6" (UID: "ff4391de-b5d6-4014-961f-a00f0a8ec3c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.176903 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "ff4391de-b5d6-4014-961f-a00f0a8ec3c6" (UID: "ff4391de-b5d6-4014-961f-a00f0a8ec3c6"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.197035 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff4391de-b5d6-4014-961f-a00f0a8ec3c6" (UID: "ff4391de-b5d6-4014-961f-a00f0a8ec3c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.222505 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-config-data" (OuterVolumeSpecName: "config-data") pod "ff4391de-b5d6-4014-961f-a00f0a8ec3c6" (UID: "ff4391de-b5d6-4014-961f-a00f0a8ec3c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.230256 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ff4391de-b5d6-4014-961f-a00f0a8ec3c6" (UID: "ff4391de-b5d6-4014-961f-a00f0a8ec3c6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.244969 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.245021 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.245061 4727 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.245074 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.245085 4727 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.245096 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwtbn\" (UniqueName: \"kubernetes.io/projected/ff4391de-b5d6-4014-961f-a00f0a8ec3c6-kube-api-access-xwtbn\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.266387 4727 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.346444 4727 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.386860 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.890178 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.941820 4727 generic.go:334] "Generic (PLEG): container finished" podID="a6468c22-4086-4884-a045-9f77a1b459d6" containerID="d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc" exitCode=0 Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.941981 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.942550 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6468c22-4086-4884-a045-9f77a1b459d6","Type":"ContainerDied","Data":"d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc"} Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.942661 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a6468c22-4086-4884-a045-9f77a1b459d6","Type":"ContainerDied","Data":"286e311f987d344ba7b01214f5cdde355f74bb75d799de3cc2f70b4e2dcb8ef4"} Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.942697 4727 scope.go:117] "RemoveContainer" containerID="d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.949499 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff4391de-b5d6-4014-961f-a00f0a8ec3c6","Type":"ContainerDied","Data":"1315cd2f0f3b85753d1dec76af163179617a09853538f37b14083755a165dd3b"} Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.949603 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.958476 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-public-tls-certs\") pod \"a6468c22-4086-4884-a045-9f77a1b459d6\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.958552 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-combined-ca-bundle\") pod \"a6468c22-4086-4884-a045-9f77a1b459d6\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.958719 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-scripts\") pod \"a6468c22-4086-4884-a045-9f77a1b459d6\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.958874 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"a6468c22-4086-4884-a045-9f77a1b459d6\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.958955 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqslh\" (UniqueName: \"kubernetes.io/projected/a6468c22-4086-4884-a045-9f77a1b459d6-kube-api-access-hqslh\") pod \"a6468c22-4086-4884-a045-9f77a1b459d6\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.959105 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-config-data\") pod \"a6468c22-4086-4884-a045-9f77a1b459d6\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.959139 4727 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6468c22-4086-4884-a045-9f77a1b459d6-logs\") pod \"a6468c22-4086-4884-a045-9f77a1b459d6\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.959176 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6468c22-4086-4884-a045-9f77a1b459d6-httpd-run\") pod \"a6468c22-4086-4884-a045-9f77a1b459d6\" (UID: \"a6468c22-4086-4884-a045-9f77a1b459d6\") " Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.959709 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6468c22-4086-4884-a045-9f77a1b459d6-logs" (OuterVolumeSpecName: "logs") pod "a6468c22-4086-4884-a045-9f77a1b459d6" (UID: "a6468c22-4086-4884-a045-9f77a1b459d6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.959851 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6468c22-4086-4884-a045-9f77a1b459d6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a6468c22-4086-4884-a045-9f77a1b459d6" (UID: "a6468c22-4086-4884-a045-9f77a1b459d6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.966418 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "a6468c22-4086-4884-a045-9f77a1b459d6" (UID: "a6468c22-4086-4884-a045-9f77a1b459d6"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.969215 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6468c22-4086-4884-a045-9f77a1b459d6-kube-api-access-hqslh" (OuterVolumeSpecName: "kube-api-access-hqslh") pod "a6468c22-4086-4884-a045-9f77a1b459d6" (UID: "a6468c22-4086-4884-a045-9f77a1b459d6"). InnerVolumeSpecName "kube-api-access-hqslh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.972181 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-scripts" (OuterVolumeSpecName: "scripts") pod "a6468c22-4086-4884-a045-9f77a1b459d6" (UID: "a6468c22-4086-4884-a045-9f77a1b459d6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.982647 4727 scope.go:117] "RemoveContainer" containerID="b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1" Oct 01 12:54:28 crc kubenswrapper[4727]: I1001 12:54:28.996927 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.032600 4727 scope.go:117] "RemoveContainer" containerID="d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc" Oct 01 12:54:29 crc kubenswrapper[4727]: E1001 12:54:29.033214 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc\": container with ID starting with d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc not found: ID does not exist" containerID="d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.033317 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc"} err="failed to get container status \"d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc\": rpc error: code = NotFound desc = could not find container \"d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc\": container with ID starting with d774ec4522fae7623c4cece95c3c9a034f1d2510cb6b544b8f0c9f36ced7acdc not found: ID does not exist" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.033397 4727 scope.go:117] "RemoveContainer" containerID="b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1" Oct 01 12:54:29 crc kubenswrapper[4727]: E1001 12:54:29.033797 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1\": container with ID starting with b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1 not found: ID does not exist" containerID="b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.033904 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1"} err="failed to get container status \"b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1\": rpc error: code = NotFound desc = could not find container \"b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1\": container with ID starting with b375ef74f3167b9d0335cc3f5c76f2d53252b723c648000543830f8c8ed1c0c1 not found: ID does not exist" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.033974 4727 scope.go:117] "RemoveContainer" containerID="61f027a0eafca855e2f7f638c146097f7d31adbceb27f575cd21de41b6ca23fa" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.035586 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.052896 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6468c22-4086-4884-a045-9f77a1b459d6" (UID: 
"a6468c22-4086-4884-a045-9f77a1b459d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.053286 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:54:29 crc kubenswrapper[4727]: E1001 12:54:29.054126 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1c4e0f3-9f27-4304-a848-6b3482161126" containerName="mariadb-database-create" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.054217 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c4e0f3-9f27-4304-a848-6b3482161126" containerName="mariadb-database-create" Oct 01 12:54:29 crc kubenswrapper[4727]: E1001 12:54:29.054318 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6468c22-4086-4884-a045-9f77a1b459d6" containerName="glance-httpd" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.054407 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6468c22-4086-4884-a045-9f77a1b459d6" containerName="glance-httpd" Oct 01 12:54:29 crc kubenswrapper[4727]: E1001 12:54:29.054509 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8df368e0-d59d-40bd-8aea-d68ab67ba406" containerName="mariadb-database-create" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.054595 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df368e0-d59d-40bd-8aea-d68ab67ba406" containerName="mariadb-database-create" Oct 01 12:54:29 crc kubenswrapper[4727]: E1001 12:54:29.054683 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4391de-b5d6-4014-961f-a00f0a8ec3c6" containerName="glance-httpd" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.054764 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4391de-b5d6-4014-961f-a00f0a8ec3c6" containerName="glance-httpd" Oct 01 12:54:29 crc kubenswrapper[4727]: E1001 12:54:29.054857 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4391de-b5d6-4014-961f-a00f0a8ec3c6" containerName="glance-log" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.054942 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4391de-b5d6-4014-961f-a00f0a8ec3c6" containerName="glance-log" Oct 01 12:54:29 crc kubenswrapper[4727]: E1001 12:54:29.055067 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586aeab7-2b38-400f-827b-6ea16b3bf9c4" containerName="mariadb-database-create" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.055139 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="586aeab7-2b38-400f-827b-6ea16b3bf9c4" containerName="mariadb-database-create" Oct 01 12:54:29 crc kubenswrapper[4727]: E1001 12:54:29.055229 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6468c22-4086-4884-a045-9f77a1b459d6" containerName="glance-log" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.056378 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6468c22-4086-4884-a045-9f77a1b459d6" containerName="glance-log" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.056621 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1c4e0f3-9f27-4304-a848-6b3482161126" containerName="mariadb-database-create" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.056688 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff4391de-b5d6-4014-961f-a00f0a8ec3c6" containerName="glance-log" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 
12:54:29.056746 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6468c22-4086-4884-a045-9f77a1b459d6" containerName="glance-log" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.056812 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="8df368e0-d59d-40bd-8aea-d68ab67ba406" containerName="mariadb-database-create" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.056955 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6468c22-4086-4884-a045-9f77a1b459d6" containerName="glance-httpd" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.057144 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="586aeab7-2b38-400f-827b-6ea16b3bf9c4" containerName="mariadb-database-create" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.057260 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff4391de-b5d6-4014-961f-a00f0a8ec3c6" containerName="glance-httpd" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.058427 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.062246 4727 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.062679 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqslh\" (UniqueName: \"kubernetes.io/projected/a6468c22-4086-4884-a045-9f77a1b459d6-kube-api-access-hqslh\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.062702 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6468c22-4086-4884-a045-9f77a1b459d6-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.062715 4727 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6468c22-4086-4884-a045-9f77a1b459d6-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.062747 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.062759 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.069081 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.069318 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.092470 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.098171 4727 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.102740 4727 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a6468c22-4086-4884-a045-9f77a1b459d6" (UID: "a6468c22-4086-4884-a045-9f77a1b459d6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.114891 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-config-data" (OuterVolumeSpecName: "config-data") pod "a6468c22-4086-4884-a045-9f77a1b459d6" (UID: "a6468c22-4086-4884-a045-9f77a1b459d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.165457 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325d7807-2792-4b29-bbfe-154c6af17f6d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.165707 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.165820 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/325d7807-2792-4b29-bbfe-154c6af17f6d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.166589 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325d7807-2792-4b29-bbfe-154c6af17f6d-logs\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.166705 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/325d7807-2792-4b29-bbfe-154c6af17f6d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.166927 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnphn\" (UniqueName: \"kubernetes.io/projected/325d7807-2792-4b29-bbfe-154c6af17f6d-kube-api-access-rnphn\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.167137 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325d7807-2792-4b29-bbfe-154c6af17f6d-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.167252 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325d7807-2792-4b29-bbfe-154c6af17f6d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.167386 4727 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.167464 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.167539 4727 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6468c22-4086-4884-a045-9f77a1b459d6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.222262 4727 scope.go:117] "RemoveContainer" containerID="558b23bc2226bc21814dd57165cf8033e45ef38cf572e228bb1d7acd22b6c525" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.269348 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnphn\" (UniqueName: \"kubernetes.io/projected/325d7807-2792-4b29-bbfe-154c6af17f6d-kube-api-access-rnphn\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.269742 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325d7807-2792-4b29-bbfe-154c6af17f6d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.269775 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325d7807-2792-4b29-bbfe-154c6af17f6d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.269803 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325d7807-2792-4b29-bbfe-154c6af17f6d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.269821 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.269849 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/325d7807-2792-4b29-bbfe-154c6af17f6d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.269865 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325d7807-2792-4b29-bbfe-154c6af17f6d-logs\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.269888 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/325d7807-2792-4b29-bbfe-154c6af17f6d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.270358 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/325d7807-2792-4b29-bbfe-154c6af17f6d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.270628 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.273729 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325d7807-2792-4b29-bbfe-154c6af17f6d-logs\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.274174 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.276579 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/325d7807-2792-4b29-bbfe-154c6af17f6d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.277182 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/325d7807-2792-4b29-bbfe-154c6af17f6d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.279101 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325d7807-2792-4b29-bbfe-154c6af17f6d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.290741 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325d7807-2792-4b29-bbfe-154c6af17f6d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.302111 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.315944 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnphn\" (UniqueName: \"kubernetes.io/projected/325d7807-2792-4b29-bbfe-154c6af17f6d-kube-api-access-rnphn\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.322275 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.332350 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.342320 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.344445 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.351599 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.376957 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"325d7807-2792-4b29-bbfe-154c6af17f6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.473379 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwfbs\" (UniqueName: \"kubernetes.io/projected/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-kube-api-access-xwfbs\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.473457 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.473570 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-config-data\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.473603 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.473869 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.474047 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-logs\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.474111 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.474141 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-scripts\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.521377 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.576799 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.576923 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.576958 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-logs\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.576985 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.577025 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-scripts\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.577175 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwfbs\" (UniqueName: \"kubernetes.io/projected/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-kube-api-access-xwfbs\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.577239 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.577407 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-config-data\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.578752 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-logs\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 
12:54:29.579079 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.579256 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.584956 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.585902 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.589617 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-scripts\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.591512 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-config-data\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.599575 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0ba1-account-create-6qhvv"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.602252 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0ba1-account-create-6qhvv" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.607644 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.613494 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0ba1-account-create-6qhvv"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.615360 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwfbs\" (UniqueName: \"kubernetes.io/projected/3a3b47a8-5894-49fc-a9a6-9a5f9062b439-kube-api-access-xwfbs\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.621023 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"3a3b47a8-5894-49fc-a9a6-9a5f9062b439\") " pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.700191 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.702154 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cdda-account-create-s6lzz"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.706867 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cdda-account-create-s6lzz" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.716349 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.748219 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cdda-account-create-s6lzz"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.788251 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-579tw\" (UniqueName: \"kubernetes.io/projected/146661f6-7cea-4d6c-904b-8252681753cb-kube-api-access-579tw\") pod \"nova-api-0ba1-account-create-6qhvv\" (UID: \"146661f6-7cea-4d6c-904b-8252681753cb\") " pod="openstack/nova-api-0ba1-account-create-6qhvv" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.904409 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-579tw\" (UniqueName: \"kubernetes.io/projected/146661f6-7cea-4d6c-904b-8252681753cb-kube-api-access-579tw\") pod \"nova-api-0ba1-account-create-6qhvv\" (UID: \"146661f6-7cea-4d6c-904b-8252681753cb\") " pod="openstack/nova-api-0ba1-account-create-6qhvv" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.904529 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ftwv\" (UniqueName: \"kubernetes.io/projected/dbff18ff-5109-45fb-8bbd-36e660aba31e-kube-api-access-6ftwv\") pod \"nova-cell0-cdda-account-create-s6lzz\" (UID: \"dbff18ff-5109-45fb-8bbd-36e660aba31e\") " pod="openstack/nova-cell0-cdda-account-create-s6lzz" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.907321 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3933-account-create-b5cwg"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 
12:54:29.910283 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3933-account-create-b5cwg" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.914447 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.929133 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3933-account-create-b5cwg"] Oct 01 12:54:29 crc kubenswrapper[4727]: I1001 12:54:29.938944 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-579tw\" (UniqueName: \"kubernetes.io/projected/146661f6-7cea-4d6c-904b-8252681753cb-kube-api-access-579tw\") pod \"nova-api-0ba1-account-create-6qhvv\" (UID: \"146661f6-7cea-4d6c-904b-8252681753cb\") " pod="openstack/nova-api-0ba1-account-create-6qhvv" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.003331 4727 generic.go:334] "Generic (PLEG): container finished" podID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerID="e678b2bf3d7b30718e2f15597e3f46ed6770ec16701fa6e230da0dd2ddfbbc9e" exitCode=0 Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.003412 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bde180d2-4139-4068-8702-4a8b8b21ffe8","Type":"ContainerDied","Data":"e678b2bf3d7b30718e2f15597e3f46ed6770ec16701fa6e230da0dd2ddfbbc9e"} Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.010074 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr85w\" (UniqueName: \"kubernetes.io/projected/6c100de0-2ce7-4c60-b790-57c91a64f9c5-kube-api-access-mr85w\") pod \"nova-cell1-3933-account-create-b5cwg\" (UID: \"6c100de0-2ce7-4c60-b790-57c91a64f9c5\") " pod="openstack/nova-cell1-3933-account-create-b5cwg" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.010209 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ftwv\" (UniqueName: \"kubernetes.io/projected/dbff18ff-5109-45fb-8bbd-36e660aba31e-kube-api-access-6ftwv\") pod \"nova-cell0-cdda-account-create-s6lzz\" (UID: \"dbff18ff-5109-45fb-8bbd-36e660aba31e\") " pod="openstack/nova-cell0-cdda-account-create-s6lzz" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.029729 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ftwv\" (UniqueName: \"kubernetes.io/projected/dbff18ff-5109-45fb-8bbd-36e660aba31e-kube-api-access-6ftwv\") pod \"nova-cell0-cdda-account-create-s6lzz\" (UID: \"dbff18ff-5109-45fb-8bbd-36e660aba31e\") " pod="openstack/nova-cell0-cdda-account-create-s6lzz" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.109826 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cdda-account-create-s6lzz" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.111337 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr85w\" (UniqueName: \"kubernetes.io/projected/6c100de0-2ce7-4c60-b790-57c91a64f9c5-kube-api-access-mr85w\") pod \"nova-cell1-3933-account-create-b5cwg\" (UID: \"6c100de0-2ce7-4c60-b790-57c91a64f9c5\") " pod="openstack/nova-cell1-3933-account-create-b5cwg" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.132782 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr85w\" (UniqueName: \"kubernetes.io/projected/6c100de0-2ce7-4c60-b790-57c91a64f9c5-kube-api-access-mr85w\") pod \"nova-cell1-3933-account-create-b5cwg\" (UID: \"6c100de0-2ce7-4c60-b790-57c91a64f9c5\") " pod="openstack/nova-cell1-3933-account-create-b5cwg" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.239640 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0ba1-account-create-6qhvv" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.239663 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3933-account-create-b5cwg" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.286159 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.385372 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.406467 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6468c22-4086-4884-a045-9f77a1b459d6" path="/var/lib/kubelet/pods/a6468c22-4086-4884-a045-9f77a1b459d6/volumes" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.407481 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff4391de-b5d6-4014-961f-a00f0a8ec3c6" path="/var/lib/kubelet/pods/ff4391de-b5d6-4014-961f-a00f0a8ec3c6/volumes" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.497532 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.522727 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-combined-ca-bundle\") pod \"bde180d2-4139-4068-8702-4a8b8b21ffe8\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.522819 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-scripts\") pod \"bde180d2-4139-4068-8702-4a8b8b21ffe8\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.522859 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcjx2\" (UniqueName: \"kubernetes.io/projected/bde180d2-4139-4068-8702-4a8b8b21ffe8-kube-api-access-lcjx2\") pod \"bde180d2-4139-4068-8702-4a8b8b21ffe8\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.522920 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bde180d2-4139-4068-8702-4a8b8b21ffe8-run-httpd\") pod \"bde180d2-4139-4068-8702-4a8b8b21ffe8\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.523054 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bde180d2-4139-4068-8702-4a8b8b21ffe8-log-httpd\") pod \"bde180d2-4139-4068-8702-4a8b8b21ffe8\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.523107 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-sg-core-conf-yaml\") pod \"bde180d2-4139-4068-8702-4a8b8b21ffe8\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.523170 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-config-data\") pod \"bde180d2-4139-4068-8702-4a8b8b21ffe8\" (UID: \"bde180d2-4139-4068-8702-4a8b8b21ffe8\") " Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.525098 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde180d2-4139-4068-8702-4a8b8b21ffe8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bde180d2-4139-4068-8702-4a8b8b21ffe8" (UID: "bde180d2-4139-4068-8702-4a8b8b21ffe8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.525460 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde180d2-4139-4068-8702-4a8b8b21ffe8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bde180d2-4139-4068-8702-4a8b8b21ffe8" (UID: "bde180d2-4139-4068-8702-4a8b8b21ffe8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.530639 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde180d2-4139-4068-8702-4a8b8b21ffe8-kube-api-access-lcjx2" (OuterVolumeSpecName: "kube-api-access-lcjx2") pod "bde180d2-4139-4068-8702-4a8b8b21ffe8" (UID: "bde180d2-4139-4068-8702-4a8b8b21ffe8"). InnerVolumeSpecName "kube-api-access-lcjx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.535255 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-scripts" (OuterVolumeSpecName: "scripts") pod "bde180d2-4139-4068-8702-4a8b8b21ffe8" (UID: "bde180d2-4139-4068-8702-4a8b8b21ffe8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.559390 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bde180d2-4139-4068-8702-4a8b8b21ffe8" (UID: "bde180d2-4139-4068-8702-4a8b8b21ffe8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.625979 4727 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bde180d2-4139-4068-8702-4a8b8b21ffe8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.626049 4727 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.626061 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.626070 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcjx2\" (UniqueName: \"kubernetes.io/projected/bde180d2-4139-4068-8702-4a8b8b21ffe8-kube-api-access-lcjx2\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.626080 4727 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bde180d2-4139-4068-8702-4a8b8b21ffe8-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.647125 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bde180d2-4139-4068-8702-4a8b8b21ffe8" (UID: "bde180d2-4139-4068-8702-4a8b8b21ffe8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.728073 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.740192 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-config-data" (OuterVolumeSpecName: "config-data") pod "bde180d2-4139-4068-8702-4a8b8b21ffe8" (UID: "bde180d2-4139-4068-8702-4a8b8b21ffe8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.759064 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cdda-account-create-s6lzz"] Oct 01 12:54:30 crc kubenswrapper[4727]: W1001 12:54:30.763166 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbff18ff_5109_45fb_8bbd_36e660aba31e.slice/crio-842430f2cf20507ca03aef4f0af2e747a8e5fac18f6543d8595d21df36533823 WatchSource:0}: Error finding container 842430f2cf20507ca03aef4f0af2e747a8e5fac18f6543d8595d21df36533823: Status 404 returned error can't find the container with id 842430f2cf20507ca03aef4f0af2e747a8e5fac18f6543d8595d21df36533823 Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.832219 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bde180d2-4139-4068-8702-4a8b8b21ffe8-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.946302 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3933-account-create-b5cwg"] Oct 01 12:54:30 crc kubenswrapper[4727]: I1001 12:54:30.956688 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0ba1-account-create-6qhvv"] Oct 01 12:54:30 crc kubenswrapper[4727]: W1001 12:54:30.960051 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c100de0_2ce7_4c60_b790_57c91a64f9c5.slice/crio-08601542904d166977847ef9cb46301022e31aadb227d6546e0a9135dab9e7a8 WatchSource:0}: Error finding container 08601542904d166977847ef9cb46301022e31aadb227d6546e0a9135dab9e7a8: Status 404 returned error can't find the container with id 08601542904d166977847ef9cb46301022e31aadb227d6546e0a9135dab9e7a8 Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.048116 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0ba1-account-create-6qhvv" event={"ID":"146661f6-7cea-4d6c-904b-8252681753cb","Type":"ContainerStarted","Data":"f06f6305c04dc29ca088bada6d78be70552aa2fe47d0913cee18a25db8a9ea78"} Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.049488 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3a3b47a8-5894-49fc-a9a6-9a5f9062b439","Type":"ContainerStarted","Data":"28476e8cfdaa4dee702522b36c76ac1c73b79221b3b3278d8f91b4ec36605f24"} Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.051581 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cdda-account-create-s6lzz" event={"ID":"dbff18ff-5109-45fb-8bbd-36e660aba31e","Type":"ContainerStarted","Data":"ab9ca6366baa24e6fa0c5c395427786e7df924e70ac81bb98e42eeff5dac5d9c"} Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.051611 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cdda-account-create-s6lzz" event={"ID":"dbff18ff-5109-45fb-8bbd-36e660aba31e","Type":"ContainerStarted","Data":"842430f2cf20507ca03aef4f0af2e747a8e5fac18f6543d8595d21df36533823"} Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.055557 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bde180d2-4139-4068-8702-4a8b8b21ffe8","Type":"ContainerDied","Data":"58348455aae1130e255a8849d81db58cd5e39e931b443d72bfa0e67bf55129b7"} Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.055613 4727 
scope.go:117] "RemoveContainer" containerID="3543f20dcaa6f4ac46073569dbcdbc26ca73bb58dffd3141ffd6ed4aaf0640d1" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.055758 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.071216 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"325d7807-2792-4b29-bbfe-154c6af17f6d","Type":"ContainerStarted","Data":"127fc7d5a8fd0b21ff1029a53c41ff763e75886e1d79eceefbaf8d89749bcec1"} Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.075263 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3933-account-create-b5cwg" event={"ID":"6c100de0-2ce7-4c60-b790-57c91a64f9c5","Type":"ContainerStarted","Data":"08601542904d166977847ef9cb46301022e31aadb227d6546e0a9135dab9e7a8"} Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.079709 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cdda-account-create-s6lzz" podStartSLOduration=2.079693392 podStartE2EDuration="2.079693392s" podCreationTimestamp="2025-10-01 12:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:31.068496492 +0000 UTC m=+1049.389851349" watchObservedRunningTime="2025-10-01 12:54:31.079693392 +0000 UTC m=+1049.401048229" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.223624 4727 scope.go:117] "RemoveContainer" containerID="bd35cc8efbc2230d464d3f10ab4faff3abba0b7afe69a0c521c6fe84b3618884" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.272221 4727 scope.go:117] "RemoveContainer" containerID="1ece483570580c93e0d8c2adadcd9bcfbe15ac51ccf7461ae9794225d5d6bb27" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.278169 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.323151 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.338153 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:31 crc kubenswrapper[4727]: E1001 12:54:31.338682 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="proxy-httpd" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.338704 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="proxy-httpd" Oct 01 12:54:31 crc kubenswrapper[4727]: E1001 12:54:31.338720 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="ceilometer-notification-agent" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.338726 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="ceilometer-notification-agent" Oct 01 12:54:31 crc kubenswrapper[4727]: E1001 12:54:31.338740 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="ceilometer-central-agent" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.338746 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="ceilometer-central-agent" Oct 01 12:54:31 crc 
kubenswrapper[4727]: E1001 12:54:31.338769 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="sg-core" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.338774 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="sg-core" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.338977 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="ceilometer-central-agent" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.338992 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="sg-core" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.339106 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="ceilometer-notification-agent" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.339172 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" containerName="proxy-httpd" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.341191 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.341661 4727 scope.go:117] "RemoveContainer" containerID="e678b2bf3d7b30718e2f15597e3f46ed6770ec16701fa6e230da0dd2ddfbbc9e" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.343967 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.344413 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.353385 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.443802 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-log-httpd\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.443910 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-run-httpd\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.443945 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-config-data\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.444263 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.444425 4727 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbswz\" (UniqueName: \"kubernetes.io/projected/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-kube-api-access-nbswz\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.444721 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-scripts\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.444907 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.547139 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-run-httpd\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.547652 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-config-data\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.547836 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.547931 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbswz\" (UniqueName: \"kubernetes.io/projected/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-kube-api-access-nbswz\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.548025 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-scripts\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.548152 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.547842 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-run-httpd\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc 
kubenswrapper[4727]: I1001 12:54:31.548433 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-log-httpd\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.550415 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-log-httpd\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.554225 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.555276 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-scripts\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.557592 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.566670 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-config-data\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.570778 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbswz\" (UniqueName: \"kubernetes.io/projected/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-kube-api-access-nbswz\") pod \"ceilometer-0\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " pod="openstack/ceilometer-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.608291 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 12:54:31 crc kubenswrapper[4727]: I1001 12:54:31.681493 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:32 crc kubenswrapper[4727]: I1001 12:54:32.085145 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3a3b47a8-5894-49fc-a9a6-9a5f9062b439","Type":"ContainerStarted","Data":"968a3bad8e13900177d30635c8bf94b633f8e0909e7b7b646d97feb5a3d79b03"} Oct 01 12:54:32 crc kubenswrapper[4727]: I1001 12:54:32.088038 4727 generic.go:334] "Generic (PLEG): container finished" podID="dbff18ff-5109-45fb-8bbd-36e660aba31e" containerID="ab9ca6366baa24e6fa0c5c395427786e7df924e70ac81bb98e42eeff5dac5d9c" exitCode=0 Oct 01 12:54:32 crc kubenswrapper[4727]: I1001 12:54:32.088107 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cdda-account-create-s6lzz" event={"ID":"dbff18ff-5109-45fb-8bbd-36e660aba31e","Type":"ContainerDied","Data":"ab9ca6366baa24e6fa0c5c395427786e7df924e70ac81bb98e42eeff5dac5d9c"} Oct 01 12:54:32 crc kubenswrapper[4727]: I1001 12:54:32.095387 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"325d7807-2792-4b29-bbfe-154c6af17f6d","Type":"ContainerStarted","Data":"9855e44bd453c938c53c551e97f17e2d951bbba46eefa772e93f2ce0d707dc2d"} Oct 01 12:54:32 crc kubenswrapper[4727]: I1001 12:54:32.097234 4727 generic.go:334] "Generic (PLEG): container finished" podID="6c100de0-2ce7-4c60-b790-57c91a64f9c5" containerID="4b46339c373e1e26afad82a49874f20909c298d40daa758f5a3425ed8d2f36cd" exitCode=0 Oct 01 12:54:32 crc kubenswrapper[4727]: I1001 12:54:32.097265 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3933-account-create-b5cwg" event={"ID":"6c100de0-2ce7-4c60-b790-57c91a64f9c5","Type":"ContainerDied","Data":"4b46339c373e1e26afad82a49874f20909c298d40daa758f5a3425ed8d2f36cd"} Oct 01 12:54:32 crc kubenswrapper[4727]: I1001 12:54:32.098966 4727 generic.go:334] "Generic (PLEG): container finished" podID="146661f6-7cea-4d6c-904b-8252681753cb" containerID="e09eda7289627cf45bd768c441c1c47cf62ff7b48e9a43d0bd05645c67d25608" exitCode=0 Oct 01 12:54:32 crc kubenswrapper[4727]: I1001 12:54:32.099180 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0ba1-account-create-6qhvv" event={"ID":"146661f6-7cea-4d6c-904b-8252681753cb","Type":"ContainerDied","Data":"e09eda7289627cf45bd768c441c1c47cf62ff7b48e9a43d0bd05645c67d25608"} Oct 01 12:54:32 crc kubenswrapper[4727]: I1001 12:54:32.256007 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:32 crc kubenswrapper[4727]: I1001 12:54:32.387276 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde180d2-4139-4068-8702-4a8b8b21ffe8" path="/var/lib/kubelet/pods/bde180d2-4139-4068-8702-4a8b8b21ffe8/volumes" Oct 01 12:54:33 crc kubenswrapper[4727]: I1001 12:54:33.112060 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc","Type":"ContainerStarted","Data":"0101c2bed508a10962a5c217f11476aa2d38bbe2e7d8de5e049287e3d2e86e98"} Oct 01 12:54:33 crc kubenswrapper[4727]: I1001 12:54:33.112427 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc","Type":"ContainerStarted","Data":"00062a61123bccc63d8cfc79150527d494bcd30b1939e921ccffb4e01b92a2ed"} Oct 01 12:54:33 crc kubenswrapper[4727]: I1001 12:54:33.115979 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"325d7807-2792-4b29-bbfe-154c6af17f6d","Type":"ContainerStarted","Data":"d3a58ea1a47170022209810a46cda77af216cfa3299a2f8be925c77e11bd7d7d"} Oct 01 12:54:33 crc kubenswrapper[4727]: I1001 12:54:33.119604 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3a3b47a8-5894-49fc-a9a6-9a5f9062b439","Type":"ContainerStarted","Data":"459b1a35ab2aa8aa5541354bd93ded2a3d8c22e89955475de2d7589d510e12a7"} Oct 01 12:54:33 crc kubenswrapper[4727]: I1001 12:54:33.143680 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.143659132 podStartE2EDuration="5.143659132s" podCreationTimestamp="2025-10-01 12:54:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:33.138326736 +0000 UTC m=+1051.459681573" watchObservedRunningTime="2025-10-01 12:54:33.143659132 +0000 UTC m=+1051.465013969" Oct 01 12:54:33 crc kubenswrapper[4727]: I1001 12:54:33.171545 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.171523682 podStartE2EDuration="4.171523682s" podCreationTimestamp="2025-10-01 12:54:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:54:33.163330867 +0000 UTC m=+1051.484685724" watchObservedRunningTime="2025-10-01 12:54:33.171523682 +0000 UTC m=+1051.492878519" Oct 01 12:54:33 crc kubenswrapper[4727]: I1001 12:54:33.857873 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cdda-account-create-s6lzz" Oct 01 12:54:33 crc kubenswrapper[4727]: I1001 12:54:33.874540 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0ba1-account-create-6qhvv" Oct 01 12:54:33 crc kubenswrapper[4727]: I1001 12:54:33.879176 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3933-account-create-b5cwg" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.004232 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-579tw\" (UniqueName: \"kubernetes.io/projected/146661f6-7cea-4d6c-904b-8252681753cb-kube-api-access-579tw\") pod \"146661f6-7cea-4d6c-904b-8252681753cb\" (UID: \"146661f6-7cea-4d6c-904b-8252681753cb\") " Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.004375 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ftwv\" (UniqueName: \"kubernetes.io/projected/dbff18ff-5109-45fb-8bbd-36e660aba31e-kube-api-access-6ftwv\") pod \"dbff18ff-5109-45fb-8bbd-36e660aba31e\" (UID: \"dbff18ff-5109-45fb-8bbd-36e660aba31e\") " Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.004418 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr85w\" (UniqueName: \"kubernetes.io/projected/6c100de0-2ce7-4c60-b790-57c91a64f9c5-kube-api-access-mr85w\") pod \"6c100de0-2ce7-4c60-b790-57c91a64f9c5\" (UID: \"6c100de0-2ce7-4c60-b790-57c91a64f9c5\") " Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.011281 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146661f6-7cea-4d6c-904b-8252681753cb-kube-api-access-579tw" (OuterVolumeSpecName: "kube-api-access-579tw") pod "146661f6-7cea-4d6c-904b-8252681753cb" (UID: "146661f6-7cea-4d6c-904b-8252681753cb"). InnerVolumeSpecName "kube-api-access-579tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.011682 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbff18ff-5109-45fb-8bbd-36e660aba31e-kube-api-access-6ftwv" (OuterVolumeSpecName: "kube-api-access-6ftwv") pod "dbff18ff-5109-45fb-8bbd-36e660aba31e" (UID: "dbff18ff-5109-45fb-8bbd-36e660aba31e"). InnerVolumeSpecName "kube-api-access-6ftwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.011785 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c100de0-2ce7-4c60-b790-57c91a64f9c5-kube-api-access-mr85w" (OuterVolumeSpecName: "kube-api-access-mr85w") pod "6c100de0-2ce7-4c60-b790-57c91a64f9c5" (UID: "6c100de0-2ce7-4c60-b790-57c91a64f9c5"). InnerVolumeSpecName "kube-api-access-mr85w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.107245 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ftwv\" (UniqueName: \"kubernetes.io/projected/dbff18ff-5109-45fb-8bbd-36e660aba31e-kube-api-access-6ftwv\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.107282 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr85w\" (UniqueName: \"kubernetes.io/projected/6c100de0-2ce7-4c60-b790-57c91a64f9c5-kube-api-access-mr85w\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.107292 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-579tw\" (UniqueName: \"kubernetes.io/projected/146661f6-7cea-4d6c-904b-8252681753cb-kube-api-access-579tw\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.130239 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3933-account-create-b5cwg" event={"ID":"6c100de0-2ce7-4c60-b790-57c91a64f9c5","Type":"ContainerDied","Data":"08601542904d166977847ef9cb46301022e31aadb227d6546e0a9135dab9e7a8"} Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.130277 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08601542904d166977847ef9cb46301022e31aadb227d6546e0a9135dab9e7a8" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.130330 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3933-account-create-b5cwg" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.138389 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0ba1-account-create-6qhvv" event={"ID":"146661f6-7cea-4d6c-904b-8252681753cb","Type":"ContainerDied","Data":"f06f6305c04dc29ca088bada6d78be70552aa2fe47d0913cee18a25db8a9ea78"} Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.138460 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f06f6305c04dc29ca088bada6d78be70552aa2fe47d0913cee18a25db8a9ea78" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.138418 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0ba1-account-create-6qhvv" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.140340 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc","Type":"ContainerStarted","Data":"55476a6c0d70341e1d3b99d2c5913b5e89a9bab5ae372957e17b85a31812225b"} Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.142824 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cdda-account-create-s6lzz" event={"ID":"dbff18ff-5109-45fb-8bbd-36e660aba31e","Type":"ContainerDied","Data":"842430f2cf20507ca03aef4f0af2e747a8e5fac18f6543d8595d21df36533823"} Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.142877 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="842430f2cf20507ca03aef4f0af2e747a8e5fac18f6543d8595d21df36533823" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.143048 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cdda-account-create-s6lzz" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.413874 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-75b5d456dc-grn5w" Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.480601 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5688b44d4b-ns86z"] Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.481484 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5688b44d4b-ns86z" podUID="53c63404-aa0e-4a37-9aaf-f75e8c50831a" containerName="neutron-api" containerID="cri-o://57d06a2d1470dd4d537ba205ae76ac84eb026f3337bb1a1074280a6153dadffb" gracePeriod=30 Oct 01 12:54:34 crc kubenswrapper[4727]: I1001 12:54:34.481731 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5688b44d4b-ns86z" podUID="53c63404-aa0e-4a37-9aaf-f75e8c50831a" containerName="neutron-httpd" containerID="cri-o://6e35902325bbfbc610695e4b063a45caa0e78489199df5def2be80f434d7d0ca" gracePeriod=30 Oct 01 12:54:35 crc kubenswrapper[4727]: I1001 12:54:35.154208 4727 generic.go:334] "Generic (PLEG): container finished" podID="53c63404-aa0e-4a37-9aaf-f75e8c50831a" containerID="6e35902325bbfbc610695e4b063a45caa0e78489199df5def2be80f434d7d0ca" exitCode=0 Oct 01 12:54:35 crc kubenswrapper[4727]: I1001 12:54:35.154314 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5688b44d4b-ns86z" event={"ID":"53c63404-aa0e-4a37-9aaf-f75e8c50831a","Type":"ContainerDied","Data":"6e35902325bbfbc610695e4b063a45caa0e78489199df5def2be80f434d7d0ca"} Oct 01 12:54:35 crc kubenswrapper[4727]: I1001 12:54:35.156476 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc","Type":"ContainerStarted","Data":"dcc06df31db4dfe9a64ca90cbfdb08fe0238d18fe11868db543943145644d26d"} Oct 01 12:54:37 crc kubenswrapper[4727]: I1001 12:54:37.180122 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc","Type":"ContainerStarted","Data":"bf274f6c57358f58ed5ec7a2769d661b380800e761a5ba2aed6b701753ddcb63"} Oct 01 12:54:37 crc kubenswrapper[4727]: I1001 12:54:37.181717 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:54:37 crc kubenswrapper[4727]: I1001 12:54:37.222262 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.764509737 podStartE2EDuration="6.22223463s" podCreationTimestamp="2025-10-01 12:54:31 +0000 UTC" firstStartedPulling="2025-10-01 12:54:32.270484563 +0000 UTC m=+1050.591839400" lastFinishedPulling="2025-10-01 12:54:36.728209456 +0000 UTC m=+1055.049564293" observedRunningTime="2025-10-01 12:54:37.21166083 +0000 UTC m=+1055.533015687" watchObservedRunningTime="2025-10-01 12:54:37.22223463 +0000 UTC m=+1055.543589467" Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.198308 4727 generic.go:334] "Generic (PLEG): container finished" podID="53c63404-aa0e-4a37-9aaf-f75e8c50831a" containerID="57d06a2d1470dd4d537ba205ae76ac84eb026f3337bb1a1074280a6153dadffb" exitCode=0 Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.199071 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5688b44d4b-ns86z" 
event={"ID":"53c63404-aa0e-4a37-9aaf-f75e8c50831a","Type":"ContainerDied","Data":"57d06a2d1470dd4d537ba205ae76ac84eb026f3337bb1a1074280a6153dadffb"} Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.591890 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.696303 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-combined-ca-bundle\") pod \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.696650 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-config\") pod \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.696691 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwxzr\" (UniqueName: \"kubernetes.io/projected/53c63404-aa0e-4a37-9aaf-f75e8c50831a-kube-api-access-vwxzr\") pod \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.696722 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-ovndb-tls-certs\") pod \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.696944 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-httpd-config\") pod \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\" (UID: \"53c63404-aa0e-4a37-9aaf-f75e8c50831a\") " Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.705276 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c63404-aa0e-4a37-9aaf-f75e8c50831a-kube-api-access-vwxzr" (OuterVolumeSpecName: "kube-api-access-vwxzr") pod "53c63404-aa0e-4a37-9aaf-f75e8c50831a" (UID: "53c63404-aa0e-4a37-9aaf-f75e8c50831a"). InnerVolumeSpecName "kube-api-access-vwxzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.708167 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "53c63404-aa0e-4a37-9aaf-f75e8c50831a" (UID: "53c63404-aa0e-4a37-9aaf-f75e8c50831a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.760206 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53c63404-aa0e-4a37-9aaf-f75e8c50831a" (UID: "53c63404-aa0e-4a37-9aaf-f75e8c50831a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.770278 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-config" (OuterVolumeSpecName: "config") pod "53c63404-aa0e-4a37-9aaf-f75e8c50831a" (UID: "53c63404-aa0e-4a37-9aaf-f75e8c50831a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.792974 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "53c63404-aa0e-4a37-9aaf-f75e8c50831a" (UID: "53c63404-aa0e-4a37-9aaf-f75e8c50831a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.799391 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.799605 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.799689 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwxzr\" (UniqueName: \"kubernetes.io/projected/53c63404-aa0e-4a37-9aaf-f75e8c50831a-kube-api-access-vwxzr\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.799777 4727 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:38 crc kubenswrapper[4727]: I1001 12:54:38.799853 4727 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/53c63404-aa0e-4a37-9aaf-f75e8c50831a-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.214587 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5688b44d4b-ns86z" event={"ID":"53c63404-aa0e-4a37-9aaf-f75e8c50831a","Type":"ContainerDied","Data":"c4eb480b7d3e0d2bc049bb91a3f646d1c71fa7dfaa97487e3aa3ad256206e652"} Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.214641 4727 scope.go:117] "RemoveContainer" containerID="6e35902325bbfbc610695e4b063a45caa0e78489199df5def2be80f434d7d0ca" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.215303 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5688b44d4b-ns86z" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.251698 4727 scope.go:117] "RemoveContainer" containerID="57d06a2d1470dd4d537ba205ae76ac84eb026f3337bb1a1074280a6153dadffb" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.259990 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5688b44d4b-ns86z"] Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.268215 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5688b44d4b-ns86z"] Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.522639 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.522692 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.558539 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.563339 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.702309 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.702585 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.736924 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.748139 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.944908 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4nf6w"] Oct 01 12:54:39 crc kubenswrapper[4727]: E1001 12:54:39.945330 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c63404-aa0e-4a37-9aaf-f75e8c50831a" containerName="neutron-api" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.945351 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c63404-aa0e-4a37-9aaf-f75e8c50831a" containerName="neutron-api" Oct 01 12:54:39 crc kubenswrapper[4727]: E1001 12:54:39.945367 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbff18ff-5109-45fb-8bbd-36e660aba31e" containerName="mariadb-account-create" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.945373 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbff18ff-5109-45fb-8bbd-36e660aba31e" containerName="mariadb-account-create" Oct 01 12:54:39 crc kubenswrapper[4727]: E1001 12:54:39.945382 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146661f6-7cea-4d6c-904b-8252681753cb" containerName="mariadb-account-create" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.945389 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="146661f6-7cea-4d6c-904b-8252681753cb" containerName="mariadb-account-create" Oct 01 12:54:39 crc kubenswrapper[4727]: E1001 12:54:39.945409 4727 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="53c63404-aa0e-4a37-9aaf-f75e8c50831a" containerName="neutron-httpd" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.945414 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c63404-aa0e-4a37-9aaf-f75e8c50831a" containerName="neutron-httpd" Oct 01 12:54:39 crc kubenswrapper[4727]: E1001 12:54:39.945427 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c100de0-2ce7-4c60-b790-57c91a64f9c5" containerName="mariadb-account-create" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.945432 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c100de0-2ce7-4c60-b790-57c91a64f9c5" containerName="mariadb-account-create" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.945593 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbff18ff-5109-45fb-8bbd-36e660aba31e" containerName="mariadb-account-create" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.945604 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="146661f6-7cea-4d6c-904b-8252681753cb" containerName="mariadb-account-create" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.945619 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c63404-aa0e-4a37-9aaf-f75e8c50831a" containerName="neutron-httpd" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.945630 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c63404-aa0e-4a37-9aaf-f75e8c50831a" containerName="neutron-api" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.945640 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c100de0-2ce7-4c60-b790-57c91a64f9c5" containerName="mariadb-account-create" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.946217 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.949447 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.949544 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.949682 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qmmth" Oct 01 12:54:39 crc kubenswrapper[4727]: I1001 12:54:39.967016 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4nf6w"] Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.031244 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bl27\" (UniqueName: \"kubernetes.io/projected/b20c7bb1-8c02-4415-9439-1d35d550b644-kube-api-access-9bl27\") pod \"nova-cell0-conductor-db-sync-4nf6w\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.031358 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-scripts\") pod \"nova-cell0-conductor-db-sync-4nf6w\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.031405 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-config-data\") pod \"nova-cell0-conductor-db-sync-4nf6w\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.031449 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4nf6w\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.133403 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-scripts\") pod \"nova-cell0-conductor-db-sync-4nf6w\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.133503 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-config-data\") pod \"nova-cell0-conductor-db-sync-4nf6w\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.133572 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4nf6w\" (UID: 
\"b20c7bb1-8c02-4415-9439-1d35d550b644\") " pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.133635 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bl27\" (UniqueName: \"kubernetes.io/projected/b20c7bb1-8c02-4415-9439-1d35d550b644-kube-api-access-9bl27\") pod \"nova-cell0-conductor-db-sync-4nf6w\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.138411 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4nf6w\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.138931 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-scripts\") pod \"nova-cell0-conductor-db-sync-4nf6w\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.153280 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-config-data\") pod \"nova-cell0-conductor-db-sync-4nf6w\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.157573 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bl27\" (UniqueName: \"kubernetes.io/projected/b20c7bb1-8c02-4415-9439-1d35d550b644-kube-api-access-9bl27\") pod \"nova-cell0-conductor-db-sync-4nf6w\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.225741 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.225804 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.226266 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.226308 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.265911 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.384729 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c63404-aa0e-4a37-9aaf-f75e8c50831a" path="/var/lib/kubelet/pods/53c63404-aa0e-4a37-9aaf-f75e8c50831a/volumes" Oct 01 12:54:40 crc kubenswrapper[4727]: I1001 12:54:40.755836 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4nf6w"] Oct 01 12:54:41 crc kubenswrapper[4727]: I1001 12:54:41.236920 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4nf6w" event={"ID":"b20c7bb1-8c02-4415-9439-1d35d550b644","Type":"ContainerStarted","Data":"11171bc5aa0fe5bbd63c6ec57af7802114cbffc1a03ba0a3c30aa8a6e571646e"} Oct 01 12:54:42 crc kubenswrapper[4727]: I1001 12:54:42.478350 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:42 crc kubenswrapper[4727]: I1001 12:54:42.478785 4727 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:54:42 crc kubenswrapper[4727]: I1001 12:54:42.573473 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 12:54:42 crc kubenswrapper[4727]: I1001 12:54:42.609111 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 12:54:42 crc kubenswrapper[4727]: I1001 12:54:42.609231 4727 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 12:54:42 crc kubenswrapper[4727]: I1001 12:54:42.615752 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 12:54:45 crc kubenswrapper[4727]: I1001 12:54:45.772429 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:45 crc kubenswrapper[4727]: I1001 12:54:45.773396 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="ceilometer-central-agent" containerID="cri-o://0101c2bed508a10962a5c217f11476aa2d38bbe2e7d8de5e049287e3d2e86e98" gracePeriod=30 Oct 01 12:54:45 crc kubenswrapper[4727]: I1001 12:54:45.773495 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="sg-core" containerID="cri-o://dcc06df31db4dfe9a64ca90cbfdb08fe0238d18fe11868db543943145644d26d" gracePeriod=30 Oct 01 12:54:45 crc kubenswrapper[4727]: I1001 12:54:45.773581 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="ceilometer-notification-agent" containerID="cri-o://55476a6c0d70341e1d3b99d2c5913b5e89a9bab5ae372957e17b85a31812225b" gracePeriod=30 Oct 01 12:54:45 crc kubenswrapper[4727]: I1001 12:54:45.773608 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="proxy-httpd" containerID="cri-o://bf274f6c57358f58ed5ec7a2769d661b380800e761a5ba2aed6b701753ddcb63" gracePeriod=30 Oct 01 12:54:46 crc kubenswrapper[4727]: I1001 12:54:46.301369 4727 generic.go:334] "Generic (PLEG): container finished" podID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" 
containerID="bf274f6c57358f58ed5ec7a2769d661b380800e761a5ba2aed6b701753ddcb63" exitCode=0 Oct 01 12:54:46 crc kubenswrapper[4727]: I1001 12:54:46.301421 4727 generic.go:334] "Generic (PLEG): container finished" podID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerID="dcc06df31db4dfe9a64ca90cbfdb08fe0238d18fe11868db543943145644d26d" exitCode=2 Oct 01 12:54:46 crc kubenswrapper[4727]: I1001 12:54:46.301434 4727 generic.go:334] "Generic (PLEG): container finished" podID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerID="0101c2bed508a10962a5c217f11476aa2d38bbe2e7d8de5e049287e3d2e86e98" exitCode=0 Oct 01 12:54:46 crc kubenswrapper[4727]: I1001 12:54:46.301462 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc","Type":"ContainerDied","Data":"bf274f6c57358f58ed5ec7a2769d661b380800e761a5ba2aed6b701753ddcb63"} Oct 01 12:54:46 crc kubenswrapper[4727]: I1001 12:54:46.301499 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc","Type":"ContainerDied","Data":"dcc06df31db4dfe9a64ca90cbfdb08fe0238d18fe11868db543943145644d26d"} Oct 01 12:54:46 crc kubenswrapper[4727]: I1001 12:54:46.301513 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc","Type":"ContainerDied","Data":"0101c2bed508a10962a5c217f11476aa2d38bbe2e7d8de5e049287e3d2e86e98"} Oct 01 12:54:48 crc kubenswrapper[4727]: I1001 12:54:48.327095 4727 generic.go:334] "Generic (PLEG): container finished" podID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerID="55476a6c0d70341e1d3b99d2c5913b5e89a9bab5ae372957e17b85a31812225b" exitCode=0 Oct 01 12:54:48 crc kubenswrapper[4727]: I1001 12:54:48.327163 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc","Type":"ContainerDied","Data":"55476a6c0d70341e1d3b99d2c5913b5e89a9bab5ae372957e17b85a31812225b"} Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.432483 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.535748 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-combined-ca-bundle\") pod \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.536129 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-config-data\") pod \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.536243 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-scripts\") pod \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.536358 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-run-httpd\") pod \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.536593 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbswz\" (UniqueName: \"kubernetes.io/projected/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-kube-api-access-nbswz\") pod \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.536731 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-sg-core-conf-yaml\") pod \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.536915 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-log-httpd\") pod \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\" (UID: \"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc\") " Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.536769 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" (UID: "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.537326 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" (UID: "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.537934 4727 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.538141 4727 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.543017 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-kube-api-access-nbswz" (OuterVolumeSpecName: "kube-api-access-nbswz") pod "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" (UID: "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc"). InnerVolumeSpecName "kube-api-access-nbswz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.543740 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-scripts" (OuterVolumeSpecName: "scripts") pod "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" (UID: "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.567211 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" (UID: "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.623255 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" (UID: "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.640742 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.640790 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.640805 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbswz\" (UniqueName: \"kubernetes.io/projected/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-kube-api-access-nbswz\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.640822 4727 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.643563 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-config-data" (OuterVolumeSpecName: "config-data") pod "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" (UID: "edacbe44-64c3-41a2-bd8e-0b6c488a9dcc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:54:49 crc kubenswrapper[4727]: I1001 12:54:49.742641 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.390337 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4nf6w" event={"ID":"b20c7bb1-8c02-4415-9439-1d35d550b644","Type":"ContainerStarted","Data":"717bf5651651b20988f784b3348033a06b65fa45e9893624d920f7e18d394d4a"} Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.398274 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edacbe44-64c3-41a2-bd8e-0b6c488a9dcc","Type":"ContainerDied","Data":"00062a61123bccc63d8cfc79150527d494bcd30b1939e921ccffb4e01b92a2ed"} Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.398394 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.399034 4727 scope.go:117] "RemoveContainer" containerID="bf274f6c57358f58ed5ec7a2769d661b380800e761a5ba2aed6b701753ddcb63" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.428181 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-4nf6w" podStartSLOduration=3.072326162 podStartE2EDuration="11.427853126s" podCreationTimestamp="2025-10-01 12:54:39 +0000 UTC" firstStartedPulling="2025-10-01 12:54:40.767380074 +0000 UTC m=+1059.088734911" lastFinishedPulling="2025-10-01 12:54:49.122907038 +0000 UTC m=+1067.444261875" observedRunningTime="2025-10-01 12:54:50.411199085 +0000 UTC m=+1068.732553942" watchObservedRunningTime="2025-10-01 12:54:50.427853126 +0000 UTC m=+1068.749207963" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.429214 4727 scope.go:117] "RemoveContainer" containerID="dcc06df31db4dfe9a64ca90cbfdb08fe0238d18fe11868db543943145644d26d" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.450694 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.462716 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.474190 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:50 crc kubenswrapper[4727]: E1001 12:54:50.478071 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="ceilometer-central-agent" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.478504 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="ceilometer-central-agent" Oct 01 12:54:50 crc kubenswrapper[4727]: E1001 12:54:50.478612 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="sg-core" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.478684 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="sg-core" Oct 01 12:54:50 crc kubenswrapper[4727]: E1001 12:54:50.478767 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="proxy-httpd" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.478824 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="proxy-httpd" Oct 01 12:54:50 crc kubenswrapper[4727]: E1001 12:54:50.478921 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="ceilometer-notification-agent" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.478978 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="ceilometer-notification-agent" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.479286 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="proxy-httpd" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.479360 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="ceilometer-central-agent" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 
12:54:50.479437 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="sg-core" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.479530 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" containerName="ceilometer-notification-agent" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.477115 4727 scope.go:117] "RemoveContainer" containerID="55476a6c0d70341e1d3b99d2c5913b5e89a9bab5ae372957e17b85a31812225b" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.482242 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.488689 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.488831 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.508153 4727 scope.go:117] "RemoveContainer" containerID="0101c2bed508a10962a5c217f11476aa2d38bbe2e7d8de5e049287e3d2e86e98" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.511513 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.565989 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.566130 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-run-httpd\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.566187 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.566270 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-log-httpd\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.566311 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-scripts\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.566356 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-config-data\") pod \"ceilometer-0\" (UID: 
\"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.566416 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58qnz\" (UniqueName: \"kubernetes.io/projected/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-kube-api-access-58qnz\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.668492 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.668580 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-run-httpd\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.668611 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.668660 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-log-httpd\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.668684 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-scripts\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.668707 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-config-data\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.668743 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58qnz\" (UniqueName: \"kubernetes.io/projected/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-kube-api-access-58qnz\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.669986 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-log-httpd\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.671061 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-run-httpd\") pod \"ceilometer-0\" (UID: 
\"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.676153 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-scripts\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.678655 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.679816 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.688808 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-config-data\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.696772 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58qnz\" (UniqueName: \"kubernetes.io/projected/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-kube-api-access-58qnz\") pod \"ceilometer-0\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " pod="openstack/ceilometer-0" Oct 01 12:54:50 crc kubenswrapper[4727]: I1001 12:54:50.807279 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:54:51 crc kubenswrapper[4727]: I1001 12:54:51.287553 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:51 crc kubenswrapper[4727]: I1001 12:54:51.423852 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee","Type":"ContainerStarted","Data":"6b56ce5b508d747114ea6b9d2336df5be9b9bcc9a94ee928c7205cebc2b36bb0"} Oct 01 12:54:51 crc kubenswrapper[4727]: I1001 12:54:51.901412 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:54:52 crc kubenswrapper[4727]: I1001 12:54:52.387795 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edacbe44-64c3-41a2-bd8e-0b6c488a9dcc" path="/var/lib/kubelet/pods/edacbe44-64c3-41a2-bd8e-0b6c488a9dcc/volumes" Oct 01 12:54:52 crc kubenswrapper[4727]: I1001 12:54:52.444213 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee","Type":"ContainerStarted","Data":"bfcbba1c5358c39debd98ccd9449ffc0bafb5e46618c2d78f8dac485f86c7e9b"} Oct 01 12:54:53 crc kubenswrapper[4727]: I1001 12:54:53.457802 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee","Type":"ContainerStarted","Data":"d5037bf16e4c31dd20f34fdf8ab22a65417119201f8e2b42b7162e9ef7f38435"} Oct 01 12:54:54 crc kubenswrapper[4727]: I1001 12:54:54.472210 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee","Type":"ContainerStarted","Data":"288e4e144ffc39623154a66d7ef4537301d04c325ba1ea2092a67705821d4216"} Oct 01 12:54:55 crc kubenswrapper[4727]: I1001 12:54:55.486956 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee","Type":"ContainerStarted","Data":"6a35062585bd1e7d212a0a0d46d459c5729a57996ba20acbb7c35cd6a5423340"} Oct 01 12:54:55 crc kubenswrapper[4727]: I1001 12:54:55.487395 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="ceilometer-central-agent" containerID="cri-o://bfcbba1c5358c39debd98ccd9449ffc0bafb5e46618c2d78f8dac485f86c7e9b" gracePeriod=30 Oct 01 12:54:55 crc kubenswrapper[4727]: I1001 12:54:55.487466 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="proxy-httpd" containerID="cri-o://6a35062585bd1e7d212a0a0d46d459c5729a57996ba20acbb7c35cd6a5423340" gracePeriod=30 Oct 01 12:54:55 crc kubenswrapper[4727]: I1001 12:54:55.487543 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="ceilometer-notification-agent" containerID="cri-o://d5037bf16e4c31dd20f34fdf8ab22a65417119201f8e2b42b7162e9ef7f38435" gracePeriod=30 Oct 01 12:54:55 crc kubenswrapper[4727]: I1001 12:54:55.487547 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="sg-core" containerID="cri-o://288e4e144ffc39623154a66d7ef4537301d04c325ba1ea2092a67705821d4216" gracePeriod=30 Oct 01 12:54:55 crc kubenswrapper[4727]: I1001 
12:54:55.487593 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:54:55 crc kubenswrapper[4727]: I1001 12:54:55.539820 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.651738921 podStartE2EDuration="5.539803257s" podCreationTimestamp="2025-10-01 12:54:50 +0000 UTC" firstStartedPulling="2025-10-01 12:54:51.295273145 +0000 UTC m=+1069.616627982" lastFinishedPulling="2025-10-01 12:54:55.183337481 +0000 UTC m=+1073.504692318" observedRunningTime="2025-10-01 12:54:55.533650435 +0000 UTC m=+1073.855005302" watchObservedRunningTime="2025-10-01 12:54:55.539803257 +0000 UTC m=+1073.861158094" Oct 01 12:54:56 crc kubenswrapper[4727]: I1001 12:54:56.499178 4727 generic.go:334] "Generic (PLEG): container finished" podID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerID="288e4e144ffc39623154a66d7ef4537301d04c325ba1ea2092a67705821d4216" exitCode=2 Oct 01 12:54:56 crc kubenswrapper[4727]: I1001 12:54:56.499446 4727 generic.go:334] "Generic (PLEG): container finished" podID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerID="d5037bf16e4c31dd20f34fdf8ab22a65417119201f8e2b42b7162e9ef7f38435" exitCode=0 Oct 01 12:54:56 crc kubenswrapper[4727]: I1001 12:54:56.499289 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee","Type":"ContainerDied","Data":"288e4e144ffc39623154a66d7ef4537301d04c325ba1ea2092a67705821d4216"} Oct 01 12:54:56 crc kubenswrapper[4727]: I1001 12:54:56.499484 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee","Type":"ContainerDied","Data":"d5037bf16e4c31dd20f34fdf8ab22a65417119201f8e2b42b7162e9ef7f38435"} Oct 01 12:55:03 crc kubenswrapper[4727]: I1001 12:55:03.562958 4727 generic.go:334] "Generic (PLEG): container finished" podID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerID="bfcbba1c5358c39debd98ccd9449ffc0bafb5e46618c2d78f8dac485f86c7e9b" exitCode=0 Oct 01 12:55:03 crc kubenswrapper[4727]: I1001 12:55:03.563015 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee","Type":"ContainerDied","Data":"bfcbba1c5358c39debd98ccd9449ffc0bafb5e46618c2d78f8dac485f86c7e9b"} Oct 01 12:55:04 crc kubenswrapper[4727]: I1001 12:55:04.574121 4727 generic.go:334] "Generic (PLEG): container finished" podID="b20c7bb1-8c02-4415-9439-1d35d550b644" containerID="717bf5651651b20988f784b3348033a06b65fa45e9893624d920f7e18d394d4a" exitCode=0 Oct 01 12:55:04 crc kubenswrapper[4727]: I1001 12:55:04.574225 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4nf6w" event={"ID":"b20c7bb1-8c02-4415-9439-1d35d550b644","Type":"ContainerDied","Data":"717bf5651651b20988f784b3348033a06b65fa45e9893624d920f7e18d394d4a"} Oct 01 12:55:05 crc kubenswrapper[4727]: I1001 12:55:05.909840 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:55:05 crc kubenswrapper[4727]: I1001 12:55:05.987729 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bl27\" (UniqueName: \"kubernetes.io/projected/b20c7bb1-8c02-4415-9439-1d35d550b644-kube-api-access-9bl27\") pod \"b20c7bb1-8c02-4415-9439-1d35d550b644\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " Oct 01 12:55:05 crc kubenswrapper[4727]: I1001 12:55:05.987804 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-scripts\") pod \"b20c7bb1-8c02-4415-9439-1d35d550b644\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " Oct 01 12:55:05 crc kubenswrapper[4727]: I1001 12:55:05.987829 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-combined-ca-bundle\") pod \"b20c7bb1-8c02-4415-9439-1d35d550b644\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " Oct 01 12:55:05 crc kubenswrapper[4727]: I1001 12:55:05.987952 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-config-data\") pod \"b20c7bb1-8c02-4415-9439-1d35d550b644\" (UID: \"b20c7bb1-8c02-4415-9439-1d35d550b644\") " Oct 01 12:55:05 crc kubenswrapper[4727]: I1001 12:55:05.994133 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-scripts" (OuterVolumeSpecName: "scripts") pod "b20c7bb1-8c02-4415-9439-1d35d550b644" (UID: "b20c7bb1-8c02-4415-9439-1d35d550b644"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:05 crc kubenswrapper[4727]: I1001 12:55:05.995258 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20c7bb1-8c02-4415-9439-1d35d550b644-kube-api-access-9bl27" (OuterVolumeSpecName: "kube-api-access-9bl27") pod "b20c7bb1-8c02-4415-9439-1d35d550b644" (UID: "b20c7bb1-8c02-4415-9439-1d35d550b644"). InnerVolumeSpecName "kube-api-access-9bl27". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.016985 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-config-data" (OuterVolumeSpecName: "config-data") pod "b20c7bb1-8c02-4415-9439-1d35d550b644" (UID: "b20c7bb1-8c02-4415-9439-1d35d550b644"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.023288 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b20c7bb1-8c02-4415-9439-1d35d550b644" (UID: "b20c7bb1-8c02-4415-9439-1d35d550b644"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.090723 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.090774 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bl27\" (UniqueName: \"kubernetes.io/projected/b20c7bb1-8c02-4415-9439-1d35d550b644-kube-api-access-9bl27\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.090791 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.090802 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20c7bb1-8c02-4415-9439-1d35d550b644-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.593252 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4nf6w" event={"ID":"b20c7bb1-8c02-4415-9439-1d35d550b644","Type":"ContainerDied","Data":"11171bc5aa0fe5bbd63c6ec57af7802114cbffc1a03ba0a3c30aa8a6e571646e"} Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.593299 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11171bc5aa0fe5bbd63c6ec57af7802114cbffc1a03ba0a3c30aa8a6e571646e" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.593331 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4nf6w" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.679772 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 12:55:06 crc kubenswrapper[4727]: E1001 12:55:06.680293 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20c7bb1-8c02-4415-9439-1d35d550b644" containerName="nova-cell0-conductor-db-sync" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.680315 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20c7bb1-8c02-4415-9439-1d35d550b644" containerName="nova-cell0-conductor-db-sync" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.680551 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="b20c7bb1-8c02-4415-9439-1d35d550b644" containerName="nova-cell0-conductor-db-sync" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.681350 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.683717 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.684186 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qmmth" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.695787 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.816494 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72df30c3-21c3-4ff3-b799-0833159289b0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"72df30c3-21c3-4ff3-b799-0833159289b0\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.816618 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h6w9\" (UniqueName: \"kubernetes.io/projected/72df30c3-21c3-4ff3-b799-0833159289b0-kube-api-access-4h6w9\") pod \"nova-cell0-conductor-0\" (UID: \"72df30c3-21c3-4ff3-b799-0833159289b0\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.816804 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72df30c3-21c3-4ff3-b799-0833159289b0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"72df30c3-21c3-4ff3-b799-0833159289b0\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.919179 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h6w9\" (UniqueName: \"kubernetes.io/projected/72df30c3-21c3-4ff3-b799-0833159289b0-kube-api-access-4h6w9\") pod \"nova-cell0-conductor-0\" (UID: \"72df30c3-21c3-4ff3-b799-0833159289b0\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.919285 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72df30c3-21c3-4ff3-b799-0833159289b0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"72df30c3-21c3-4ff3-b799-0833159289b0\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.919357 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72df30c3-21c3-4ff3-b799-0833159289b0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"72df30c3-21c3-4ff3-b799-0833159289b0\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.923817 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72df30c3-21c3-4ff3-b799-0833159289b0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"72df30c3-21c3-4ff3-b799-0833159289b0\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.923950 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72df30c3-21c3-4ff3-b799-0833159289b0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"72df30c3-21c3-4ff3-b799-0833159289b0\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.937847 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h6w9\" (UniqueName: \"kubernetes.io/projected/72df30c3-21c3-4ff3-b799-0833159289b0-kube-api-access-4h6w9\") pod \"nova-cell0-conductor-0\" (UID: \"72df30c3-21c3-4ff3-b799-0833159289b0\") " pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:06 crc kubenswrapper[4727]: I1001 12:55:06.997432 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:07 crc kubenswrapper[4727]: I1001 12:55:07.411070 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 12:55:07 crc kubenswrapper[4727]: I1001 12:55:07.604133 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"72df30c3-21c3-4ff3-b799-0833159289b0","Type":"ContainerStarted","Data":"bc8c6f3b07eed079db975dddd8cd9302c926bc3dbc8be959299ba55d232ed8c5"} Oct 01 12:55:07 crc kubenswrapper[4727]: I1001 12:55:07.604277 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:07 crc kubenswrapper[4727]: I1001 12:55:07.621305 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.621289035 podStartE2EDuration="1.621289035s" podCreationTimestamp="2025-10-01 12:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:07.619940552 +0000 UTC m=+1085.941295399" watchObservedRunningTime="2025-10-01 12:55:07.621289035 +0000 UTC m=+1085.942643872" Oct 01 12:55:08 crc kubenswrapper[4727]: I1001 12:55:08.613572 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"72df30c3-21c3-4ff3-b799-0833159289b0","Type":"ContainerStarted","Data":"8c7c5c267aa0feee2044db5ff9bfdbdb9b838ff35fb1f36c25355af19852d76e"} Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.031348 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.474963 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-wr7vr"] Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.476557 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.479667 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.480369 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.484475 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wr7vr"] Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.640917 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.642319 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.644511 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.648502 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-scripts\") pod \"nova-cell0-cell-mapping-wr7vr\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.648649 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-config-data\") pod \"nova-cell0-cell-mapping-wr7vr\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.648689 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6v9v\" (UniqueName: \"kubernetes.io/projected/58437054-5624-488d-a672-2eb046c0d09c-kube-api-access-v6v9v\") pod \"nova-cell0-cell-mapping-wr7vr\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.648737 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wr7vr\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.668096 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.685102 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.686369 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.690820 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.726395 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.750292 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhxc2\" (UniqueName: \"kubernetes.io/projected/bdce9668-f762-4b0d-b33a-76a1b270c575-kube-api-access-bhxc2\") pod \"nova-scheduler-0\" (UID: \"bdce9668-f762-4b0d-b33a-76a1b270c575\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.750359 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdce9668-f762-4b0d-b33a-76a1b270c575-config-data\") pod \"nova-scheduler-0\" (UID: \"bdce9668-f762-4b0d-b33a-76a1b270c575\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.750396 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-scripts\") pod \"nova-cell0-cell-mapping-wr7vr\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.750467 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-config-data\") pod \"nova-cell0-cell-mapping-wr7vr\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.750494 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6v9v\" (UniqueName: \"kubernetes.io/projected/58437054-5624-488d-a672-2eb046c0d09c-kube-api-access-v6v9v\") pod \"nova-cell0-cell-mapping-wr7vr\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.750508 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdce9668-f762-4b0d-b33a-76a1b270c575-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bdce9668-f762-4b0d-b33a-76a1b270c575\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.750540 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wr7vr\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.761812 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-wr7vr\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: 
I1001 12:55:17.771057 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-scripts\") pod \"nova-cell0-cell-mapping-wr7vr\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.775095 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-config-data\") pod \"nova-cell0-cell-mapping-wr7vr\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.778649 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6v9v\" (UniqueName: \"kubernetes.io/projected/58437054-5624-488d-a672-2eb046c0d09c-kube-api-access-v6v9v\") pod \"nova-cell0-cell-mapping-wr7vr\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.783219 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.785300 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.787643 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.789537 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.823763 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.852863 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.852992 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhxc2\" (UniqueName: \"kubernetes.io/projected/bdce9668-f762-4b0d-b33a-76a1b270c575-kube-api-access-bhxc2\") pod \"nova-scheduler-0\" (UID: \"bdce9668-f762-4b0d-b33a-76a1b270c575\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.853052 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.853106 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdce9668-f762-4b0d-b33a-76a1b270c575-config-data\") pod \"nova-scheduler-0\" (UID: \"bdce9668-f762-4b0d-b33a-76a1b270c575\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.853242 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h5g4\" (UniqueName: \"kubernetes.io/projected/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-kube-api-access-2h5g4\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.853291 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdce9668-f762-4b0d-b33a-76a1b270c575-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bdce9668-f762-4b0d-b33a-76a1b270c575\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.863135 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdce9668-f762-4b0d-b33a-76a1b270c575-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bdce9668-f762-4b0d-b33a-76a1b270c575\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.872691 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdce9668-f762-4b0d-b33a-76a1b270c575-config-data\") pod \"nova-scheduler-0\" (UID: \"bdce9668-f762-4b0d-b33a-76a1b270c575\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.891895 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.893937 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhxc2\" (UniqueName: \"kubernetes.io/projected/bdce9668-f762-4b0d-b33a-76a1b270c575-kube-api-access-bhxc2\") pod \"nova-scheduler-0\" (UID: 
\"bdce9668-f762-4b0d-b33a-76a1b270c575\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.895586 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.903936 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.917909 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.955094 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs9hk\" (UniqueName: \"kubernetes.io/projected/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-kube-api-access-cs9hk\") pod \"nova-api-0\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " pod="openstack/nova-api-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.955155 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.955178 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-config-data\") pod \"nova-api-0\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " pod="openstack/nova-api-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.955252 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-logs\") pod \"nova-api-0\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " pod="openstack/nova-api-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.955295 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h5g4\" (UniqueName: \"kubernetes.io/projected/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-kube-api-access-2h5g4\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.955326 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.955357 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " pod="openstack/nova-api-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.959260 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.961814 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:55:17 crc kubenswrapper[4727]: I1001 12:55:17.976131 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.015782 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h5g4\" (UniqueName: \"kubernetes.io/projected/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-kube-api-access-2h5g4\") pod \"nova-cell1-novncproxy-0\" (UID: \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.017882 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-mlvdt"] Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.021149 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.026308 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-mlvdt"] Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.059797 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8prs8\" (UniqueName: \"kubernetes.io/projected/b08d82fe-8a21-4226-917b-c98332c161a9-kube-api-access-8prs8\") pod \"nova-metadata-0\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.059859 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-logs\") pod \"nova-api-0\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " pod="openstack/nova-api-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.059948 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b08d82fe-8a21-4226-917b-c98332c161a9-config-data\") pod \"nova-metadata-0\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.059989 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " pod="openstack/nova-api-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.060054 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs9hk\" (UniqueName: \"kubernetes.io/projected/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-kube-api-access-cs9hk\") pod \"nova-api-0\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " pod="openstack/nova-api-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.060087 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b08d82fe-8a21-4226-917b-c98332c161a9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.060111 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-config-data\") pod \"nova-api-0\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " pod="openstack/nova-api-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.060169 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b08d82fe-8a21-4226-917b-c98332c161a9-logs\") pod \"nova-metadata-0\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.060629 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-logs\") pod \"nova-api-0\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " pod="openstack/nova-api-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.065072 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-config-data\") pod \"nova-api-0\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " pod="openstack/nova-api-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.067987 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " pod="openstack/nova-api-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.081867 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs9hk\" (UniqueName: \"kubernetes.io/projected/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-kube-api-access-cs9hk\") pod \"nova-api-0\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " pod="openstack/nova-api-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.162638 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.162715 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b08d82fe-8a21-4226-917b-c98332c161a9-logs\") pod \"nova-metadata-0\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.162761 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-config\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.162817 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.162988 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8prs8\" (UniqueName: \"kubernetes.io/projected/b08d82fe-8a21-4226-917b-c98332c161a9-kube-api-access-8prs8\") pod \"nova-metadata-0\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.163059 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.163138 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.163307 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b08d82fe-8a21-4226-917b-c98332c161a9-config-data\") pod \"nova-metadata-0\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.163497 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg57b\" (UniqueName: \"kubernetes.io/projected/a62fbedb-de7c-424a-bdec-92639359a708-kube-api-access-kg57b\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.163597 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08d82fe-8a21-4226-917b-c98332c161a9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.167388 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b08d82fe-8a21-4226-917b-c98332c161a9-logs\") pod \"nova-metadata-0\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.179672 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08d82fe-8a21-4226-917b-c98332c161a9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.180093 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b08d82fe-8a21-4226-917b-c98332c161a9-config-data\") pod 
\"nova-metadata-0\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.184222 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8prs8\" (UniqueName: \"kubernetes.io/projected/b08d82fe-8a21-4226-917b-c98332c161a9-kube-api-access-8prs8\") pod \"nova-metadata-0\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.265891 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg57b\" (UniqueName: \"kubernetes.io/projected/a62fbedb-de7c-424a-bdec-92639359a708-kube-api-access-kg57b\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.266032 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.266124 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-config\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.266201 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.266274 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.266336 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.268035 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.268646 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-config\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 
12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.269354 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.269987 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.271352 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.283130 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg57b\" (UniqueName: \"kubernetes.io/projected/a62fbedb-de7c-424a-bdec-92639359a708-kube-api-access-kg57b\") pod \"dnsmasq-dns-845d6d6f59-mlvdt\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.312751 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.325042 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.338455 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.350233 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.480563 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kfqzs"] Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.481823 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.487971 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.488236 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.495738 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-wr7vr"] Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.517237 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kfqzs"] Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.557439 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:55:18 crc kubenswrapper[4727]: W1001 12:55:18.587768 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdce9668_f762_4b0d_b33a_76a1b270c575.slice/crio-7a0ac7af122ed651b01226eb40bce6c1e1c9bb7685dc06317be6eceb0aa141e0 WatchSource:0}: Error finding container 7a0ac7af122ed651b01226eb40bce6c1e1c9bb7685dc06317be6eceb0aa141e0: Status 404 returned error can't find the container with id 7a0ac7af122ed651b01226eb40bce6c1e1c9bb7685dc06317be6eceb0aa141e0 Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.594597 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kfqzs\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.594841 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-scripts\") pod \"nova-cell1-conductor-db-sync-kfqzs\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.594957 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h47lx\" (UniqueName: \"kubernetes.io/projected/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-kube-api-access-h47lx\") pod \"nova-cell1-conductor-db-sync-kfqzs\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.595083 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-config-data\") pod \"nova-cell1-conductor-db-sync-kfqzs\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.718707 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-scripts\") pod \"nova-cell1-conductor-db-sync-kfqzs\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: 
I1001 12:55:18.719122 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h47lx\" (UniqueName: \"kubernetes.io/projected/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-kube-api-access-h47lx\") pod \"nova-cell1-conductor-db-sync-kfqzs\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.719229 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-config-data\") pod \"nova-cell1-conductor-db-sync-kfqzs\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.719399 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kfqzs\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.725284 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-scripts\") pod \"nova-cell1-conductor-db-sync-kfqzs\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.725434 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-kfqzs\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.735325 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-config-data\") pod \"nova-cell1-conductor-db-sync-kfqzs\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.739890 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wr7vr" event={"ID":"58437054-5624-488d-a672-2eb046c0d09c","Type":"ContainerStarted","Data":"c6ff3a9525bebc3540fb1745aff538c73629207a1f1203bf109fdfb2132de70c"} Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.744502 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bdce9668-f762-4b0d-b33a-76a1b270c575","Type":"ContainerStarted","Data":"7a0ac7af122ed651b01226eb40bce6c1e1c9bb7685dc06317be6eceb0aa141e0"} Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.747112 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h47lx\" (UniqueName: \"kubernetes.io/projected/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-kube-api-access-h47lx\") pod \"nova-cell1-conductor-db-sync-kfqzs\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.822719 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.938415 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:55:18 crc kubenswrapper[4727]: W1001 12:55:18.948916 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf46d91a0_8be6_46bc_a325_5b4ccc433a6d.slice/crio-b43b2076af8dd0e5467155ae61e62f38b812ad0da0378a245aaa17ba4be7789d WatchSource:0}: Error finding container b43b2076af8dd0e5467155ae61e62f38b812ad0da0378a245aaa17ba4be7789d: Status 404 returned error can't find the container with id b43b2076af8dd0e5467155ae61e62f38b812ad0da0378a245aaa17ba4be7789d Oct 01 12:55:18 crc kubenswrapper[4727]: I1001 12:55:18.996123 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:55:19 crc kubenswrapper[4727]: W1001 12:55:19.000118 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e55ab75_534e_4cf9_9a5c_58d5da07ad7b.slice/crio-44fdf3e6c0c1dc06d767f6bba2133a535605acadad2d24c7ab3e5e28ae01fc39 WatchSource:0}: Error finding container 44fdf3e6c0c1dc06d767f6bba2133a535605acadad2d24c7ab3e5e28ae01fc39: Status 404 returned error can't find the container with id 44fdf3e6c0c1dc06d767f6bba2133a535605acadad2d24c7ab3e5e28ae01fc39 Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.144621 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.166818 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-mlvdt"] Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.343115 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kfqzs"] Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.762976 4727 generic.go:334] "Generic (PLEG): container finished" podID="a62fbedb-de7c-424a-bdec-92639359a708" containerID="cfc566c49464a3065c000e25c972aab21cfb832ca7ca1ca7e047a4d1d0511c3f" exitCode=0 Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.763321 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" event={"ID":"a62fbedb-de7c-424a-bdec-92639359a708","Type":"ContainerDied","Data":"cfc566c49464a3065c000e25c972aab21cfb832ca7ca1ca7e047a4d1d0511c3f"} Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.763382 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" event={"ID":"a62fbedb-de7c-424a-bdec-92639359a708","Type":"ContainerStarted","Data":"6dfbaec7e566dc7c292b6881b877229338ecae9ee5047dc1c47e59864e20b7a1"} Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.765976 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b08d82fe-8a21-4226-917b-c98332c161a9","Type":"ContainerStarted","Data":"f08621022cea1f2d44a2f63af66d199944e3091d176644824b6506477bce377a"} Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.774855 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kfqzs" event={"ID":"ccd566f7-b3ff-4de8-8ec9-8c080005d70a","Type":"ContainerStarted","Data":"7507da85d2ce54f81bb9b454e600e5cb7a114181bc5d055bc2b141684875eb8c"} Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.774906 4727 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kfqzs" event={"ID":"ccd566f7-b3ff-4de8-8ec9-8c080005d70a","Type":"ContainerStarted","Data":"7e11a1258751c484960b0814d06767112c35676680e9cc26978e356188b39b98"} Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.789304 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b","Type":"ContainerStarted","Data":"44fdf3e6c0c1dc06d767f6bba2133a535605acadad2d24c7ab3e5e28ae01fc39"} Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.814554 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f46d91a0-8be6-46bc-a325-5b4ccc433a6d","Type":"ContainerStarted","Data":"b43b2076af8dd0e5467155ae61e62f38b812ad0da0378a245aaa17ba4be7789d"} Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.828989 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-kfqzs" podStartSLOduration=1.8289429419999998 podStartE2EDuration="1.828942942s" podCreationTimestamp="2025-10-01 12:55:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:19.815843142 +0000 UTC m=+1098.137197979" watchObservedRunningTime="2025-10-01 12:55:19.828942942 +0000 UTC m=+1098.150297779" Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.831806 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wr7vr" event={"ID":"58437054-5624-488d-a672-2eb046c0d09c","Type":"ContainerStarted","Data":"76e3eea5716d1ca4c8b469c7d39bfa91104b0e74884abd01413900c8f4e64805"} Oct 01 12:55:19 crc kubenswrapper[4727]: I1001 12:55:19.855525 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-wr7vr" podStartSLOduration=2.855504422 podStartE2EDuration="2.855504422s" podCreationTimestamp="2025-10-01 12:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:19.849018229 +0000 UTC m=+1098.170373086" watchObservedRunningTime="2025-10-01 12:55:19.855504422 +0000 UTC m=+1098.176859269" Oct 01 12:55:20 crc kubenswrapper[4727]: I1001 12:55:20.814212 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 01 12:55:21 crc kubenswrapper[4727]: I1001 12:55:21.540175 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:21 crc kubenswrapper[4727]: I1001 12:55:21.568931 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.858358 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bdce9668-f762-4b0d-b33a-76a1b270c575","Type":"ContainerStarted","Data":"0b8723d2f029ae92faa1986ae123176ba068aa6360323ae4f35507518e8fecc2"} Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.859706 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b","Type":"ContainerStarted","Data":"0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24"} Oct 01 12:55:22 crc 
kubenswrapper[4727]: I1001 12:55:22.859839 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8e55ab75-534e-4cf9-9a5c-58d5da07ad7b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24" gracePeriod=30 Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.862358 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f46d91a0-8be6-46bc-a325-5b4ccc433a6d","Type":"ContainerStarted","Data":"55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf"} Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.862423 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f46d91a0-8be6-46bc-a325-5b4ccc433a6d","Type":"ContainerStarted","Data":"c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35"} Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.866160 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" event={"ID":"a62fbedb-de7c-424a-bdec-92639359a708","Type":"ContainerStarted","Data":"dedb8d305ee13ccb866e1f6882d20d74107dd9c5fdcf8048d754c24185f844ed"} Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.866329 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.869240 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b08d82fe-8a21-4226-917b-c98332c161a9","Type":"ContainerStarted","Data":"17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e"} Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.869278 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b08d82fe-8a21-4226-917b-c98332c161a9","Type":"ContainerStarted","Data":"236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8"} Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.869356 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b08d82fe-8a21-4226-917b-c98332c161a9" containerName="nova-metadata-log" containerID="cri-o://236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8" gracePeriod=30 Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.869389 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b08d82fe-8a21-4226-917b-c98332c161a9" containerName="nova-metadata-metadata" containerID="cri-o://17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e" gracePeriod=30 Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.881764 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.858316479 podStartE2EDuration="5.881749323s" podCreationTimestamp="2025-10-01 12:55:17 +0000 UTC" firstStartedPulling="2025-10-01 12:55:18.629378317 +0000 UTC m=+1096.950733154" lastFinishedPulling="2025-10-01 12:55:21.652811171 +0000 UTC m=+1099.974165998" observedRunningTime="2025-10-01 12:55:22.877167961 +0000 UTC m=+1101.198522808" watchObservedRunningTime="2025-10-01 12:55:22.881749323 +0000 UTC m=+1101.203104160" Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.903604 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=3.222702563 podStartE2EDuration="5.903585266s" podCreationTimestamp="2025-10-01 12:55:17 +0000 UTC" firstStartedPulling="2025-10-01 12:55:18.972164175 +0000 UTC m=+1097.293519012" lastFinishedPulling="2025-10-01 12:55:21.653046878 +0000 UTC m=+1099.974401715" observedRunningTime="2025-10-01 12:55:22.894040397 +0000 UTC m=+1101.215395274" watchObservedRunningTime="2025-10-01 12:55:22.903585266 +0000 UTC m=+1101.224940123" Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.928384 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.4268382600000002 podStartE2EDuration="5.928307798s" podCreationTimestamp="2025-10-01 12:55:17 +0000 UTC" firstStartedPulling="2025-10-01 12:55:19.15093918 +0000 UTC m=+1097.472294017" lastFinishedPulling="2025-10-01 12:55:21.652408718 +0000 UTC m=+1099.973763555" observedRunningTime="2025-10-01 12:55:22.91620004 +0000 UTC m=+1101.237554877" watchObservedRunningTime="2025-10-01 12:55:22.928307798 +0000 UTC m=+1101.249662645" Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.948209 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" podStartSLOduration=5.9481871 podStartE2EDuration="5.9481871s" podCreationTimestamp="2025-10-01 12:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:22.939111585 +0000 UTC m=+1101.260466432" watchObservedRunningTime="2025-10-01 12:55:22.9481871 +0000 UTC m=+1101.269541947" Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.962319 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 12:55:22 crc kubenswrapper[4727]: I1001 12:55:22.975415 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.343335853 podStartE2EDuration="5.9753943s" podCreationTimestamp="2025-10-01 12:55:17 +0000 UTC" firstStartedPulling="2025-10-01 12:55:19.020807665 +0000 UTC m=+1097.342162502" lastFinishedPulling="2025-10-01 12:55:21.652866112 +0000 UTC m=+1099.974220949" observedRunningTime="2025-10-01 12:55:22.955467086 +0000 UTC m=+1101.276821923" watchObservedRunningTime="2025-10-01 12:55:22.9753943 +0000 UTC m=+1101.296749137" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.314545 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.340764 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.340813 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.474818 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.639468 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8prs8\" (UniqueName: \"kubernetes.io/projected/b08d82fe-8a21-4226-917b-c98332c161a9-kube-api-access-8prs8\") pod \"b08d82fe-8a21-4226-917b-c98332c161a9\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.640398 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b08d82fe-8a21-4226-917b-c98332c161a9-config-data\") pod \"b08d82fe-8a21-4226-917b-c98332c161a9\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.640475 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08d82fe-8a21-4226-917b-c98332c161a9-combined-ca-bundle\") pod \"b08d82fe-8a21-4226-917b-c98332c161a9\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.640754 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b08d82fe-8a21-4226-917b-c98332c161a9-logs\") pod \"b08d82fe-8a21-4226-917b-c98332c161a9\" (UID: \"b08d82fe-8a21-4226-917b-c98332c161a9\") " Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.641108 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b08d82fe-8a21-4226-917b-c98332c161a9-logs" (OuterVolumeSpecName: "logs") pod "b08d82fe-8a21-4226-917b-c98332c161a9" (UID: "b08d82fe-8a21-4226-917b-c98332c161a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.641286 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b08d82fe-8a21-4226-917b-c98332c161a9-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.645740 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08d82fe-8a21-4226-917b-c98332c161a9-kube-api-access-8prs8" (OuterVolumeSpecName: "kube-api-access-8prs8") pod "b08d82fe-8a21-4226-917b-c98332c161a9" (UID: "b08d82fe-8a21-4226-917b-c98332c161a9"). InnerVolumeSpecName "kube-api-access-8prs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.670324 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08d82fe-8a21-4226-917b-c98332c161a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b08d82fe-8a21-4226-917b-c98332c161a9" (UID: "b08d82fe-8a21-4226-917b-c98332c161a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.673205 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08d82fe-8a21-4226-917b-c98332c161a9-config-data" (OuterVolumeSpecName: "config-data") pod "b08d82fe-8a21-4226-917b-c98332c161a9" (UID: "b08d82fe-8a21-4226-917b-c98332c161a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.742846 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b08d82fe-8a21-4226-917b-c98332c161a9-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.742890 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b08d82fe-8a21-4226-917b-c98332c161a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.742907 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8prs8\" (UniqueName: \"kubernetes.io/projected/b08d82fe-8a21-4226-917b-c98332c161a9-kube-api-access-8prs8\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.886221 4727 generic.go:334] "Generic (PLEG): container finished" podID="b08d82fe-8a21-4226-917b-c98332c161a9" containerID="17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e" exitCode=0 Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.886263 4727 generic.go:334] "Generic (PLEG): container finished" podID="b08d82fe-8a21-4226-917b-c98332c161a9" containerID="236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8" exitCode=143 Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.886306 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.886382 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b08d82fe-8a21-4226-917b-c98332c161a9","Type":"ContainerDied","Data":"17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e"} Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.886413 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b08d82fe-8a21-4226-917b-c98332c161a9","Type":"ContainerDied","Data":"236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8"} Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.886426 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b08d82fe-8a21-4226-917b-c98332c161a9","Type":"ContainerDied","Data":"f08621022cea1f2d44a2f63af66d199944e3091d176644824b6506477bce377a"} Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.886445 4727 scope.go:117] "RemoveContainer" containerID="17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.934739 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.943643 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.965415 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:23 crc kubenswrapper[4727]: E1001 12:55:23.965840 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08d82fe-8a21-4226-917b-c98332c161a9" containerName="nova-metadata-metadata" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.965860 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08d82fe-8a21-4226-917b-c98332c161a9" containerName="nova-metadata-metadata" Oct 01 12:55:23 crc kubenswrapper[4727]: E1001 
12:55:23.965890 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08d82fe-8a21-4226-917b-c98332c161a9" containerName="nova-metadata-log" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.965897 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08d82fe-8a21-4226-917b-c98332c161a9" containerName="nova-metadata-log" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.966103 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08d82fe-8a21-4226-917b-c98332c161a9" containerName="nova-metadata-metadata" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.966122 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08d82fe-8a21-4226-917b-c98332c161a9" containerName="nova-metadata-log" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.967085 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.974185 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.982350 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:23 crc kubenswrapper[4727]: I1001 12:55:23.985572 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.097490 4727 scope.go:117] "RemoveContainer" containerID="236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.132900 4727 scope.go:117] "RemoveContainer" containerID="17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e" Oct 01 12:55:24 crc kubenswrapper[4727]: E1001 12:55:24.137131 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e\": container with ID starting with 17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e not found: ID does not exist" containerID="17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.137370 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e"} err="failed to get container status \"17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e\": rpc error: code = NotFound desc = could not find container \"17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e\": container with ID starting with 17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e not found: ID does not exist" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.137477 4727 scope.go:117] "RemoveContainer" containerID="236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8" Oct 01 12:55:24 crc kubenswrapper[4727]: E1001 12:55:24.139099 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8\": container with ID starting with 236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8 not found: ID does not exist" containerID="236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 
12:55:24.139212 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8"} err="failed to get container status \"236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8\": rpc error: code = NotFound desc = could not find container \"236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8\": container with ID starting with 236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8 not found: ID does not exist" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.139320 4727 scope.go:117] "RemoveContainer" containerID="17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.142056 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e"} err="failed to get container status \"17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e\": rpc error: code = NotFound desc = could not find container \"17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e\": container with ID starting with 17789a4033e5f357166aea512a2e3daa47da19dda9754391dfc982b6ec16a72e not found: ID does not exist" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.142180 4727 scope.go:117] "RemoveContainer" containerID="236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.142533 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8"} err="failed to get container status \"236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8\": rpc error: code = NotFound desc = could not find container \"236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8\": container with ID starting with 236f043ec37083c98bad63f4f60dbf5d1cad17a6eb9d2ddd7db9db8a4df2aae8 not found: ID does not exist" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.167591 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.167666 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.167727 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2csh\" (UniqueName: \"kubernetes.io/projected/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-kube-api-access-n2csh\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.167773 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-config-data\") pod 
\"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.167798 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-logs\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.269580 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.269973 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2csh\" (UniqueName: \"kubernetes.io/projected/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-kube-api-access-n2csh\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.270204 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-config-data\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.270358 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-logs\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.270748 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.271104 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-logs\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.275344 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.275430 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-config-data\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.277545 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.291467 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2csh\" (UniqueName: \"kubernetes.io/projected/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-kube-api-access-n2csh\") pod \"nova-metadata-0\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.370555 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.388987 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08d82fe-8a21-4226-917b-c98332c161a9" path="/var/lib/kubelet/pods/b08d82fe-8a21-4226-917b-c98332c161a9/volumes" Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.815048 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:24 crc kubenswrapper[4727]: I1001 12:55:24.897055 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a6cebd0-4027-4eba-b6fc-67f3b9aea252","Type":"ContainerStarted","Data":"6971b5c7b7196364770a76040efdac53227d574e3d0ea9bf5b29c6d09df61da8"} Oct 01 12:55:25 crc kubenswrapper[4727]: I1001 12:55:25.911372 4727 generic.go:334] "Generic (PLEG): container finished" podID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerID="6a35062585bd1e7d212a0a0d46d459c5729a57996ba20acbb7c35cd6a5423340" exitCode=137 Oct 01 12:55:25 crc kubenswrapper[4727]: I1001 12:55:25.911475 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee","Type":"ContainerDied","Data":"6a35062585bd1e7d212a0a0d46d459c5729a57996ba20acbb7c35cd6a5423340"} Oct 01 12:55:25 crc kubenswrapper[4727]: I1001 12:55:25.911857 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee","Type":"ContainerDied","Data":"6b56ce5b508d747114ea6b9d2336df5be9b9bcc9a94ee928c7205cebc2b36bb0"} Oct 01 12:55:25 crc kubenswrapper[4727]: I1001 12:55:25.911876 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b56ce5b508d747114ea6b9d2336df5be9b9bcc9a94ee928c7205cebc2b36bb0" Oct 01 12:55:25 crc kubenswrapper[4727]: I1001 12:55:25.913194 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a6cebd0-4027-4eba-b6fc-67f3b9aea252","Type":"ContainerStarted","Data":"8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128"} Oct 01 12:55:25 crc kubenswrapper[4727]: I1001 12:55:25.913236 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a6cebd0-4027-4eba-b6fc-67f3b9aea252","Type":"ContainerStarted","Data":"268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1"} Oct 01 12:55:25 crc kubenswrapper[4727]: I1001 12:55:25.920927 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:55:25 crc kubenswrapper[4727]: I1001 12:55:25.936905 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.936888756 podStartE2EDuration="2.936888756s" podCreationTimestamp="2025-10-01 12:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:25.929044939 +0000 UTC m=+1104.250399776" watchObservedRunningTime="2025-10-01 12:55:25.936888756 +0000 UTC m=+1104.258243593" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.016730 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-scripts\") pod \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.016875 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-combined-ca-bundle\") pod \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.016936 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-config-data\") pod \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.017069 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-log-httpd\") pod \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.017129 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-run-httpd\") pod \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.017146 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58qnz\" (UniqueName: \"kubernetes.io/projected/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-kube-api-access-58qnz\") pod \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.017173 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-sg-core-conf-yaml\") pod \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\" (UID: \"86ad6e56-25b1-4c21-b2ed-33f4f64e61ee\") " Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.019130 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" (UID: "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.019203 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" (UID: "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.036312 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-scripts" (OuterVolumeSpecName: "scripts") pod "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" (UID: "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.040454 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-kube-api-access-58qnz" (OuterVolumeSpecName: "kube-api-access-58qnz") pod "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" (UID: "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee"). InnerVolumeSpecName "kube-api-access-58qnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.048206 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" (UID: "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.093351 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" (UID: "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.115027 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-config-data" (OuterVolumeSpecName: "config-data") pod "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" (UID: "86ad6e56-25b1-4c21-b2ed-33f4f64e61ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.120097 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.120137 4727 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.120147 4727 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.120159 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58qnz\" (UniqueName: \"kubernetes.io/projected/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-kube-api-access-58qnz\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.120170 4727 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.120178 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.120185 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.921654 4727 generic.go:334] "Generic (PLEG): container finished" podID="58437054-5624-488d-a672-2eb046c0d09c" containerID="76e3eea5716d1ca4c8b469c7d39bfa91104b0e74884abd01413900c8f4e64805" exitCode=0 Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.921753 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wr7vr" event={"ID":"58437054-5624-488d-a672-2eb046c0d09c","Type":"ContainerDied","Data":"76e3eea5716d1ca4c8b469c7d39bfa91104b0e74884abd01413900c8f4e64805"} Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.922078 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.963711 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.972424 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.997390 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:26 crc kubenswrapper[4727]: E1001 12:55:26.998142 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="proxy-httpd" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.998268 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="proxy-httpd" Oct 01 12:55:26 crc kubenswrapper[4727]: E1001 12:55:26.998377 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="ceilometer-notification-agent" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.998448 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="ceilometer-notification-agent" Oct 01 12:55:26 crc kubenswrapper[4727]: E1001 12:55:26.998527 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="ceilometer-central-agent" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.998595 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="ceilometer-central-agent" Oct 01 12:55:26 crc kubenswrapper[4727]: E1001 12:55:26.998672 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="sg-core" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.998749 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="sg-core" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.999165 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="sg-core" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.999271 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="ceilometer-central-agent" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.999366 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="proxy-httpd" Oct 01 12:55:26 crc kubenswrapper[4727]: I1001 12:55:26.999449 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" containerName="ceilometer-notification-agent" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.004477 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.007933 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.008504 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.010605 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.039069 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27b755da-d064-4481-b856-4b51bb15cecb-run-httpd\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.039300 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-config-data\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.039466 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twkgw\" (UniqueName: \"kubernetes.io/projected/27b755da-d064-4481-b856-4b51bb15cecb-kube-api-access-twkgw\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.039574 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-scripts\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.039655 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27b755da-d064-4481-b856-4b51bb15cecb-log-httpd\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.039736 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.039826 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.140432 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 
12:55:27.140506 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27b755da-d064-4481-b856-4b51bb15cecb-run-httpd\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.140523 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-config-data\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.140585 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twkgw\" (UniqueName: \"kubernetes.io/projected/27b755da-d064-4481-b856-4b51bb15cecb-kube-api-access-twkgw\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.140617 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-scripts\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.140645 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27b755da-d064-4481-b856-4b51bb15cecb-log-httpd\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.140664 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.141274 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27b755da-d064-4481-b856-4b51bb15cecb-log-httpd\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.141347 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27b755da-d064-4481-b856-4b51bb15cecb-run-httpd\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.146059 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.147873 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-config-data\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.148535 4727 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.155440 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-scripts\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.157097 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twkgw\" (UniqueName: \"kubernetes.io/projected/27b755da-d064-4481-b856-4b51bb15cecb-kube-api-access-twkgw\") pod \"ceilometer-0\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.325179 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.741833 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.935619 4727 generic.go:334] "Generic (PLEG): container finished" podID="ccd566f7-b3ff-4de8-8ec9-8c080005d70a" containerID="7507da85d2ce54f81bb9b454e600e5cb7a114181bc5d055bc2b141684875eb8c" exitCode=0 Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.935696 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kfqzs" event={"ID":"ccd566f7-b3ff-4de8-8ec9-8c080005d70a","Type":"ContainerDied","Data":"7507da85d2ce54f81bb9b454e600e5cb7a114181bc5d055bc2b141684875eb8c"} Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.937328 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27b755da-d064-4481-b856-4b51bb15cecb","Type":"ContainerStarted","Data":"ab26382c2d1a3ca68c36b50aeb8656829916d467a5c3db235041143e82fda2c6"} Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.963216 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 12:55:27 crc kubenswrapper[4727]: I1001 12:55:27.993512 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.325825 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.326239 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.353160 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.385364 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86ad6e56-25b1-4c21-b2ed-33f4f64e61ee" path="/var/lib/kubelet/pods/86ad6e56-25b1-4c21-b2ed-33f4f64e61ee/volumes" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.424624 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-thhck"] Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.424895 4727 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-5784cf869f-thhck" podUID="dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" containerName="dnsmasq-dns" containerID="cri-o://759006ab7302535a7223715dd8e5b23bbc8d0cd4ded5a1c343658428088b08a8" gracePeriod=10 Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.495061 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.573179 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-combined-ca-bundle\") pod \"58437054-5624-488d-a672-2eb046c0d09c\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.573370 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-scripts\") pod \"58437054-5624-488d-a672-2eb046c0d09c\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.573478 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6v9v\" (UniqueName: \"kubernetes.io/projected/58437054-5624-488d-a672-2eb046c0d09c-kube-api-access-v6v9v\") pod \"58437054-5624-488d-a672-2eb046c0d09c\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.573642 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-config-data\") pod \"58437054-5624-488d-a672-2eb046c0d09c\" (UID: \"58437054-5624-488d-a672-2eb046c0d09c\") " Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.586410 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58437054-5624-488d-a672-2eb046c0d09c-kube-api-access-v6v9v" (OuterVolumeSpecName: "kube-api-access-v6v9v") pod "58437054-5624-488d-a672-2eb046c0d09c" (UID: "58437054-5624-488d-a672-2eb046c0d09c"). InnerVolumeSpecName "kube-api-access-v6v9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.589210 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-scripts" (OuterVolumeSpecName: "scripts") pod "58437054-5624-488d-a672-2eb046c0d09c" (UID: "58437054-5624-488d-a672-2eb046c0d09c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.611208 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58437054-5624-488d-a672-2eb046c0d09c" (UID: "58437054-5624-488d-a672-2eb046c0d09c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.611254 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-config-data" (OuterVolumeSpecName: "config-data") pod "58437054-5624-488d-a672-2eb046c0d09c" (UID: "58437054-5624-488d-a672-2eb046c0d09c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.675956 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6v9v\" (UniqueName: \"kubernetes.io/projected/58437054-5624-488d-a672-2eb046c0d09c-kube-api-access-v6v9v\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.676233 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.676243 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.676251 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58437054-5624-488d-a672-2eb046c0d09c-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.948738 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-wr7vr" event={"ID":"58437054-5624-488d-a672-2eb046c0d09c","Type":"ContainerDied","Data":"c6ff3a9525bebc3540fb1745aff538c73629207a1f1203bf109fdfb2132de70c"} Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.948783 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6ff3a9525bebc3540fb1745aff538c73629207a1f1203bf109fdfb2132de70c" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.948839 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-wr7vr" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.964226 4727 generic.go:334] "Generic (PLEG): container finished" podID="dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" containerID="759006ab7302535a7223715dd8e5b23bbc8d0cd4ded5a1c343658428088b08a8" exitCode=0 Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.964333 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-thhck" event={"ID":"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0","Type":"ContainerDied","Data":"759006ab7302535a7223715dd8e5b23bbc8d0cd4ded5a1c343658428088b08a8"} Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.964398 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-thhck" event={"ID":"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0","Type":"ContainerDied","Data":"e3215d1dd88a5196137e727b9e2db015b4325c899f36b4175fe83b5a6b244d14"} Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.964412 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3215d1dd88a5196137e727b9e2db015b4325c899f36b4175fe83b5a6b244d14" Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.969577 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27b755da-d064-4481-b856-4b51bb15cecb","Type":"ContainerStarted","Data":"c4ffa85c04b17609481b4ceab6673b2e6df719e0fdfedd586410b36416c5e0cc"} Oct 01 12:55:28 crc kubenswrapper[4727]: I1001 12:55:28.998722 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.026107 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.080410 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-dns-svc\") pod \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.080522 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-dns-swift-storage-0\") pod \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.080544 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-ovsdbserver-sb\") pod \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.080568 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-ovsdbserver-nb\") pod \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.080619 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-config\") pod \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.080638 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbgwh\" (UniqueName: \"kubernetes.io/projected/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-kube-api-access-qbgwh\") pod \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\" (UID: \"dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0\") " Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.094638 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-kube-api-access-qbgwh" (OuterVolumeSpecName: "kube-api-access-qbgwh") pod "dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" (UID: "dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0"). InnerVolumeSpecName "kube-api-access-qbgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.143906 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.144113 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" containerName="nova-api-log" containerID="cri-o://c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35" gracePeriod=30 Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.144469 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" containerName="nova-api-api" containerID="cri-o://55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf" gracePeriod=30 Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.154940 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": EOF" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.156378 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": EOF" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.184727 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbgwh\" (UniqueName: \"kubernetes.io/projected/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-kube-api-access-qbgwh\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.185496 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.185759 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0a6cebd0-4027-4eba-b6fc-67f3b9aea252" containerName="nova-metadata-log" containerID="cri-o://268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1" gracePeriod=30 Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.186269 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0a6cebd0-4027-4eba-b6fc-67f3b9aea252" containerName="nova-metadata-metadata" containerID="cri-o://8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128" gracePeriod=30 Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.239671 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" (UID: "dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.240190 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" (UID: "dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.249195 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-config" (OuterVolumeSpecName: "config") pod "dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" (UID: "dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.252833 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" (UID: "dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.299319 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" (UID: "dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.299976 4727 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.299992 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.300024 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.300033 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.300042 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.358701 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.371688 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.371747 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.504866 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h47lx\" (UniqueName: \"kubernetes.io/projected/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-kube-api-access-h47lx\") pod \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.505108 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-combined-ca-bundle\") pod \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.505651 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-scripts\") pod \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.505743 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-config-data\") pod \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\" (UID: \"ccd566f7-b3ff-4de8-8ec9-8c080005d70a\") " Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.513778 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-scripts" (OuterVolumeSpecName: "scripts") pod "ccd566f7-b3ff-4de8-8ec9-8c080005d70a" (UID: "ccd566f7-b3ff-4de8-8ec9-8c080005d70a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.513799 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-kube-api-access-h47lx" (OuterVolumeSpecName: "kube-api-access-h47lx") pod "ccd566f7-b3ff-4de8-8ec9-8c080005d70a" (UID: "ccd566f7-b3ff-4de8-8ec9-8c080005d70a"). InnerVolumeSpecName "kube-api-access-h47lx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.548917 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-config-data" (OuterVolumeSpecName: "config-data") pod "ccd566f7-b3ff-4de8-8ec9-8c080005d70a" (UID: "ccd566f7-b3ff-4de8-8ec9-8c080005d70a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.566463 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccd566f7-b3ff-4de8-8ec9-8c080005d70a" (UID: "ccd566f7-b3ff-4de8-8ec9-8c080005d70a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.607885 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.607931 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h47lx\" (UniqueName: \"kubernetes.io/projected/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-kube-api-access-h47lx\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.607943 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.607951 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd566f7-b3ff-4de8-8ec9-8c080005d70a-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:29 crc kubenswrapper[4727]: I1001 12:55:29.629768 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.014859 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27b755da-d064-4481-b856-4b51bb15cecb","Type":"ContainerStarted","Data":"184d11bd35d7572dbf356a155a2801c9a8efcfee8103e175b674d63c5a20b094"} Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.037752 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.052249 4727 generic.go:334] "Generic (PLEG): container finished" podID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" containerID="c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35" exitCode=143 Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.052417 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f46d91a0-8be6-46bc-a325-5b4ccc433a6d","Type":"ContainerDied","Data":"c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35"} Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.102661 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-kfqzs" event={"ID":"ccd566f7-b3ff-4de8-8ec9-8c080005d70a","Type":"ContainerDied","Data":"7e11a1258751c484960b0814d06767112c35676680e9cc26978e356188b39b98"} Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.102702 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e11a1258751c484960b0814d06767112c35676680e9cc26978e356188b39b98" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.102813 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-kfqzs" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.103929 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 12:55:30 crc kubenswrapper[4727]: E1001 12:55:30.104554 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58437054-5624-488d-a672-2eb046c0d09c" containerName="nova-manage" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.104580 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="58437054-5624-488d-a672-2eb046c0d09c" containerName="nova-manage" Oct 01 12:55:30 crc kubenswrapper[4727]: E1001 12:55:30.104606 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" containerName="init" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.104614 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" containerName="init" Oct 01 12:55:30 crc kubenswrapper[4727]: E1001 12:55:30.104634 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" containerName="dnsmasq-dns" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.104641 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" containerName="dnsmasq-dns" Oct 01 12:55:30 crc kubenswrapper[4727]: E1001 12:55:30.104664 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6cebd0-4027-4eba-b6fc-67f3b9aea252" containerName="nova-metadata-log" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.104672 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6cebd0-4027-4eba-b6fc-67f3b9aea252" containerName="nova-metadata-log" Oct 01 12:55:30 crc kubenswrapper[4727]: E1001 12:55:30.104705 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd566f7-b3ff-4de8-8ec9-8c080005d70a" containerName="nova-cell1-conductor-db-sync" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.104711 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd566f7-b3ff-4de8-8ec9-8c080005d70a" containerName="nova-cell1-conductor-db-sync" Oct 01 12:55:30 crc kubenswrapper[4727]: E1001 12:55:30.104725 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6cebd0-4027-4eba-b6fc-67f3b9aea252" containerName="nova-metadata-metadata" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.104731 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6cebd0-4027-4eba-b6fc-67f3b9aea252" containerName="nova-metadata-metadata" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.104898 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6cebd0-4027-4eba-b6fc-67f3b9aea252" containerName="nova-metadata-metadata" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.104912 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6cebd0-4027-4eba-b6fc-67f3b9aea252" containerName="nova-metadata-log" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.104932 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" containerName="dnsmasq-dns" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.104941 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="58437054-5624-488d-a672-2eb046c0d09c" containerName="nova-manage" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.104950 4727 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="ccd566f7-b3ff-4de8-8ec9-8c080005d70a" containerName="nova-cell1-conductor-db-sync" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.107491 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.112287 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.112671 4727 generic.go:334] "Generic (PLEG): container finished" podID="0a6cebd0-4027-4eba-b6fc-67f3b9aea252" containerID="8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128" exitCode=0 Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.112696 4727 generic.go:334] "Generic (PLEG): container finished" podID="0a6cebd0-4027-4eba-b6fc-67f3b9aea252" containerID="268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1" exitCode=143 Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.113241 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.113318 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a6cebd0-4027-4eba-b6fc-67f3b9aea252","Type":"ContainerDied","Data":"8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128"} Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.113394 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a6cebd0-4027-4eba-b6fc-67f3b9aea252","Type":"ContainerDied","Data":"268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1"} Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.113409 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a6cebd0-4027-4eba-b6fc-67f3b9aea252","Type":"ContainerDied","Data":"6971b5c7b7196364770a76040efdac53227d574e3d0ea9bf5b29c6d09df61da8"} Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.113427 4727 scope.go:117] "RemoveContainer" containerID="8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.113599 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-thhck" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.120239 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.130398 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwbv5\" (UniqueName: \"kubernetes.io/projected/913743df-f049-4011-bbff-2d7abf043bf3-kube-api-access-mwbv5\") pod \"nova-cell1-conductor-0\" (UID: \"913743df-f049-4011-bbff-2d7abf043bf3\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.130483 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/913743df-f049-4011-bbff-2d7abf043bf3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"913743df-f049-4011-bbff-2d7abf043bf3\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.130584 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/913743df-f049-4011-bbff-2d7abf043bf3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"913743df-f049-4011-bbff-2d7abf043bf3\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.178055 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-thhck"] Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.179266 4727 scope.go:117] "RemoveContainer" containerID="268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.186044 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-thhck"] Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.212732 4727 scope.go:117] "RemoveContainer" containerID="8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128" Oct 01 12:55:30 crc kubenswrapper[4727]: E1001 12:55:30.213446 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128\": container with ID starting with 8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128 not found: ID does not exist" containerID="8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.213475 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128"} err="failed to get container status \"8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128\": rpc error: code = NotFound desc = could not find container \"8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128\": container with ID starting with 8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128 not found: ID does not exist" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.213497 4727 scope.go:117] "RemoveContainer" containerID="268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1" Oct 01 12:55:30 crc kubenswrapper[4727]: E1001 12:55:30.213714 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1\": container with ID starting with 268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1 not found: ID does not exist" containerID="268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.213759 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1"} err="failed to get container status \"268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1\": rpc error: code = NotFound desc = could not find container \"268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1\": container with ID starting with 268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1 not found: ID does not exist" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.213790 4727 scope.go:117] "RemoveContainer" containerID="8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.214163 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128"} err="failed to get container status \"8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128\": rpc error: code = NotFound desc = could not find container \"8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128\": container with ID starting with 8a16e21b9eefa0f2d0b4fc305c712bd2bf71b79a66de26fa298e9642dabc6128 not found: ID does not exist" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.214189 4727 scope.go:117] "RemoveContainer" containerID="268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.215135 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1"} err="failed to get container status \"268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1\": rpc error: code = NotFound desc = could not find container \"268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1\": container with ID starting with 268951fd34f89f1cd017eda943d0e124934917d45fd16597439dbe68de2f71c1 not found: ID does not exist" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.231289 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-combined-ca-bundle\") pod \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.231430 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-nova-metadata-tls-certs\") pod \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.231583 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-config-data\") pod \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.231602 4727 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-logs\") pod \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.231640 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2csh\" (UniqueName: \"kubernetes.io/projected/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-kube-api-access-n2csh\") pod \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\" (UID: \"0a6cebd0-4027-4eba-b6fc-67f3b9aea252\") " Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.231951 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwbv5\" (UniqueName: \"kubernetes.io/projected/913743df-f049-4011-bbff-2d7abf043bf3-kube-api-access-mwbv5\") pod \"nova-cell1-conductor-0\" (UID: \"913743df-f049-4011-bbff-2d7abf043bf3\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.232017 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/913743df-f049-4011-bbff-2d7abf043bf3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"913743df-f049-4011-bbff-2d7abf043bf3\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.232072 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/913743df-f049-4011-bbff-2d7abf043bf3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"913743df-f049-4011-bbff-2d7abf043bf3\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.232255 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-logs" (OuterVolumeSpecName: "logs") pod "0a6cebd0-4027-4eba-b6fc-67f3b9aea252" (UID: "0a6cebd0-4027-4eba-b6fc-67f3b9aea252"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.239705 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-kube-api-access-n2csh" (OuterVolumeSpecName: "kube-api-access-n2csh") pod "0a6cebd0-4027-4eba-b6fc-67f3b9aea252" (UID: "0a6cebd0-4027-4eba-b6fc-67f3b9aea252"). InnerVolumeSpecName "kube-api-access-n2csh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.243765 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/913743df-f049-4011-bbff-2d7abf043bf3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"913743df-f049-4011-bbff-2d7abf043bf3\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.244314 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/913743df-f049-4011-bbff-2d7abf043bf3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"913743df-f049-4011-bbff-2d7abf043bf3\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.259307 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwbv5\" (UniqueName: \"kubernetes.io/projected/913743df-f049-4011-bbff-2d7abf043bf3-kube-api-access-mwbv5\") pod \"nova-cell1-conductor-0\" (UID: \"913743df-f049-4011-bbff-2d7abf043bf3\") " pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.288183 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a6cebd0-4027-4eba-b6fc-67f3b9aea252" (UID: "0a6cebd0-4027-4eba-b6fc-67f3b9aea252"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.289298 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-config-data" (OuterVolumeSpecName: "config-data") pod "0a6cebd0-4027-4eba-b6fc-67f3b9aea252" (UID: "0a6cebd0-4027-4eba-b6fc-67f3b9aea252"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.328524 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0a6cebd0-4027-4eba-b6fc-67f3b9aea252" (UID: "0a6cebd0-4027-4eba-b6fc-67f3b9aea252"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.333668 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.333707 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.333717 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2csh\" (UniqueName: \"kubernetes.io/projected/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-kube-api-access-n2csh\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.333726 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.333735 4727 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6cebd0-4027-4eba-b6fc-67f3b9aea252-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.382830 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0" path="/var/lib/kubelet/pods/dfdfa5a1-13c7-4d61-aeaf-41c118e6cfd0/volumes" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.435099 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.448647 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.458138 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.467168 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.470711 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.474599 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.475057 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.485990 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.648963 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.649375 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c077ba19-0c1d-469a-8614-90ec0aa263ba-logs\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.649413 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-config-data\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.649471 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.649541 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmmbz\" (UniqueName: \"kubernetes.io/projected/c077ba19-0c1d-469a-8614-90ec0aa263ba-kube-api-access-kmmbz\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.750898 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.750965 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c077ba19-0c1d-469a-8614-90ec0aa263ba-logs\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.751038 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-config-data\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 
12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.751093 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.751153 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmmbz\" (UniqueName: \"kubernetes.io/projected/c077ba19-0c1d-469a-8614-90ec0aa263ba-kube-api-access-kmmbz\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.753135 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c077ba19-0c1d-469a-8614-90ec0aa263ba-logs\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.758510 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-config-data\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.758655 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.760885 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.771822 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmmbz\" (UniqueName: \"kubernetes.io/projected/c077ba19-0c1d-469a-8614-90ec0aa263ba-kube-api-access-kmmbz\") pod \"nova-metadata-0\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " pod="openstack/nova-metadata-0" Oct 01 12:55:30 crc kubenswrapper[4727]: I1001 12:55:30.808184 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:55:31 crc kubenswrapper[4727]: I1001 12:55:31.029310 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 12:55:31 crc kubenswrapper[4727]: I1001 12:55:31.129902 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"913743df-f049-4011-bbff-2d7abf043bf3","Type":"ContainerStarted","Data":"ec067d7412674a84194f78f8fa861d13f47347ce212b82ede240dc94b7998ad2"} Oct 01 12:55:31 crc kubenswrapper[4727]: I1001 12:55:31.132157 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bdce9668-f762-4b0d-b33a-76a1b270c575" containerName="nova-scheduler-scheduler" containerID="cri-o://0b8723d2f029ae92faa1986ae123176ba068aa6360323ae4f35507518e8fecc2" gracePeriod=30 Oct 01 12:55:31 crc kubenswrapper[4727]: I1001 12:55:31.132384 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27b755da-d064-4481-b856-4b51bb15cecb","Type":"ContainerStarted","Data":"c16e2f9b8424ed52750b02ed699a36ac6a57f4a0415d76bbc27179cdc0701b06"} Oct 01 12:55:31 crc kubenswrapper[4727]: I1001 12:55:31.278250 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:55:32 crc kubenswrapper[4727]: I1001 12:55:32.142821 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"913743df-f049-4011-bbff-2d7abf043bf3","Type":"ContainerStarted","Data":"b2fbb4a2de590a3fbe381c618863841ea063eec68bf10e87f43d4f003f98f1f7"} Oct 01 12:55:32 crc kubenswrapper[4727]: I1001 12:55:32.144495 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:32 crc kubenswrapper[4727]: I1001 12:55:32.147362 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c077ba19-0c1d-469a-8614-90ec0aa263ba","Type":"ContainerStarted","Data":"02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc"} Oct 01 12:55:32 crc kubenswrapper[4727]: I1001 12:55:32.147427 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c077ba19-0c1d-469a-8614-90ec0aa263ba","Type":"ContainerStarted","Data":"2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269"} Oct 01 12:55:32 crc kubenswrapper[4727]: I1001 12:55:32.147438 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c077ba19-0c1d-469a-8614-90ec0aa263ba","Type":"ContainerStarted","Data":"7f21797b7672e143e0584fa65b55ab7821a2f84c47a689c58456ad08eee7b2f5"} Oct 01 12:55:32 crc kubenswrapper[4727]: I1001 12:55:32.164825 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.164804415 podStartE2EDuration="2.164804415s" podCreationTimestamp="2025-10-01 12:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:32.158736208 +0000 UTC m=+1110.480091045" watchObservedRunningTime="2025-10-01 12:55:32.164804415 +0000 UTC m=+1110.486159262" Oct 01 12:55:32 crc kubenswrapper[4727]: I1001 12:55:32.186317 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.186295766 podStartE2EDuration="2.186295766s" podCreationTimestamp="2025-10-01 
12:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:32.178885122 +0000 UTC m=+1110.500239979" watchObservedRunningTime="2025-10-01 12:55:32.186295766 +0000 UTC m=+1110.507650603" Oct 01 12:55:32 crc kubenswrapper[4727]: I1001 12:55:32.393535 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a6cebd0-4027-4eba-b6fc-67f3b9aea252" path="/var/lib/kubelet/pods/0a6cebd0-4027-4eba-b6fc-67f3b9aea252/volumes" Oct 01 12:55:32 crc kubenswrapper[4727]: E1001 12:55:32.965957 4727 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0b8723d2f029ae92faa1986ae123176ba068aa6360323ae4f35507518e8fecc2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 12:55:32 crc kubenswrapper[4727]: E1001 12:55:32.968509 4727 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0b8723d2f029ae92faa1986ae123176ba068aa6360323ae4f35507518e8fecc2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 12:55:32 crc kubenswrapper[4727]: E1001 12:55:32.970310 4727 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0b8723d2f029ae92faa1986ae123176ba068aa6360323ae4f35507518e8fecc2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 12:55:32 crc kubenswrapper[4727]: E1001 12:55:32.970382 4727 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bdce9668-f762-4b0d-b33a-76a1b270c575" containerName="nova-scheduler-scheduler" Oct 01 12:55:33 crc kubenswrapper[4727]: I1001 12:55:33.156730 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27b755da-d064-4481-b856-4b51bb15cecb","Type":"ContainerStarted","Data":"e135f691e09488d4250bfad3220af7049a0e6f679595beabf482a08dd683b6ca"} Oct 01 12:55:33 crc kubenswrapper[4727]: I1001 12:55:33.157674 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:55:33 crc kubenswrapper[4727]: I1001 12:55:33.189265 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.34121273 podStartE2EDuration="7.189243359s" podCreationTimestamp="2025-10-01 12:55:26 +0000 UTC" firstStartedPulling="2025-10-01 12:55:27.749365575 +0000 UTC m=+1106.070720412" lastFinishedPulling="2025-10-01 12:55:32.597396204 +0000 UTC m=+1110.918751041" observedRunningTime="2025-10-01 12:55:33.17403101 +0000 UTC m=+1111.495385867" watchObservedRunningTime="2025-10-01 12:55:33.189243359 +0000 UTC m=+1111.510598196" Oct 01 12:55:33 crc kubenswrapper[4727]: I1001 12:55:33.291845 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:55:33 crc 
kubenswrapper[4727]: I1001 12:55:33.291929 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:55:34 crc kubenswrapper[4727]: I1001 12:55:34.174535 4727 generic.go:334] "Generic (PLEG): container finished" podID="bdce9668-f762-4b0d-b33a-76a1b270c575" containerID="0b8723d2f029ae92faa1986ae123176ba068aa6360323ae4f35507518e8fecc2" exitCode=0 Oct 01 12:55:34 crc kubenswrapper[4727]: I1001 12:55:34.174625 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bdce9668-f762-4b0d-b33a-76a1b270c575","Type":"ContainerDied","Data":"0b8723d2f029ae92faa1986ae123176ba068aa6360323ae4f35507518e8fecc2"} Oct 01 12:55:34 crc kubenswrapper[4727]: I1001 12:55:34.361619 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:55:34 crc kubenswrapper[4727]: I1001 12:55:34.524975 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdce9668-f762-4b0d-b33a-76a1b270c575-config-data\") pod \"bdce9668-f762-4b0d-b33a-76a1b270c575\" (UID: \"bdce9668-f762-4b0d-b33a-76a1b270c575\") " Oct 01 12:55:34 crc kubenswrapper[4727]: I1001 12:55:34.525146 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdce9668-f762-4b0d-b33a-76a1b270c575-combined-ca-bundle\") pod \"bdce9668-f762-4b0d-b33a-76a1b270c575\" (UID: \"bdce9668-f762-4b0d-b33a-76a1b270c575\") " Oct 01 12:55:34 crc kubenswrapper[4727]: I1001 12:55:34.525289 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhxc2\" (UniqueName: \"kubernetes.io/projected/bdce9668-f762-4b0d-b33a-76a1b270c575-kube-api-access-bhxc2\") pod \"bdce9668-f762-4b0d-b33a-76a1b270c575\" (UID: \"bdce9668-f762-4b0d-b33a-76a1b270c575\") " Oct 01 12:55:34 crc kubenswrapper[4727]: I1001 12:55:34.546351 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdce9668-f762-4b0d-b33a-76a1b270c575-kube-api-access-bhxc2" (OuterVolumeSpecName: "kube-api-access-bhxc2") pod "bdce9668-f762-4b0d-b33a-76a1b270c575" (UID: "bdce9668-f762-4b0d-b33a-76a1b270c575"). InnerVolumeSpecName "kube-api-access-bhxc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:34 crc kubenswrapper[4727]: I1001 12:55:34.558546 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdce9668-f762-4b0d-b33a-76a1b270c575-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdce9668-f762-4b0d-b33a-76a1b270c575" (UID: "bdce9668-f762-4b0d-b33a-76a1b270c575"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:34 crc kubenswrapper[4727]: I1001 12:55:34.563929 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdce9668-f762-4b0d-b33a-76a1b270c575-config-data" (OuterVolumeSpecName: "config-data") pod "bdce9668-f762-4b0d-b33a-76a1b270c575" (UID: "bdce9668-f762-4b0d-b33a-76a1b270c575"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:34 crc kubenswrapper[4727]: I1001 12:55:34.628495 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdce9668-f762-4b0d-b33a-76a1b270c575-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:34 crc kubenswrapper[4727]: I1001 12:55:34.628530 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdce9668-f762-4b0d-b33a-76a1b270c575-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:34 crc kubenswrapper[4727]: I1001 12:55:34.628542 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhxc2\" (UniqueName: \"kubernetes.io/projected/bdce9668-f762-4b0d-b33a-76a1b270c575-kube-api-access-bhxc2\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.186359 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bdce9668-f762-4b0d-b33a-76a1b270c575","Type":"ContainerDied","Data":"7a0ac7af122ed651b01226eb40bce6c1e1c9bb7685dc06317be6eceb0aa141e0"} Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.186453 4727 scope.go:117] "RemoveContainer" containerID="0b8723d2f029ae92faa1986ae123176ba068aa6360323ae4f35507518e8fecc2" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.186400 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.229041 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.252412 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.260350 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:55:35 crc kubenswrapper[4727]: E1001 12:55:35.260936 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdce9668-f762-4b0d-b33a-76a1b270c575" containerName="nova-scheduler-scheduler" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.260962 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdce9668-f762-4b0d-b33a-76a1b270c575" containerName="nova-scheduler-scheduler" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.261340 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdce9668-f762-4b0d-b33a-76a1b270c575" containerName="nova-scheduler-scheduler" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.262264 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.268057 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.278155 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.442260 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228ffba7-ed93-43c8-b1df-c9c68c337461-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"228ffba7-ed93-43c8-b1df-c9c68c337461\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.442376 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4kns\" (UniqueName: \"kubernetes.io/projected/228ffba7-ed93-43c8-b1df-c9c68c337461-kube-api-access-k4kns\") pod \"nova-scheduler-0\" (UID: \"228ffba7-ed93-43c8-b1df-c9c68c337461\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.442425 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/228ffba7-ed93-43c8-b1df-c9c68c337461-config-data\") pod \"nova-scheduler-0\" (UID: \"228ffba7-ed93-43c8-b1df-c9c68c337461\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.545230 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228ffba7-ed93-43c8-b1df-c9c68c337461-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"228ffba7-ed93-43c8-b1df-c9c68c337461\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.545331 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4kns\" (UniqueName: \"kubernetes.io/projected/228ffba7-ed93-43c8-b1df-c9c68c337461-kube-api-access-k4kns\") pod \"nova-scheduler-0\" (UID: \"228ffba7-ed93-43c8-b1df-c9c68c337461\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.545382 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/228ffba7-ed93-43c8-b1df-c9c68c337461-config-data\") pod \"nova-scheduler-0\" (UID: \"228ffba7-ed93-43c8-b1df-c9c68c337461\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.554878 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228ffba7-ed93-43c8-b1df-c9c68c337461-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"228ffba7-ed93-43c8-b1df-c9c68c337461\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.559635 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/228ffba7-ed93-43c8-b1df-c9c68c337461-config-data\") pod \"nova-scheduler-0\" (UID: \"228ffba7-ed93-43c8-b1df-c9c68c337461\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.561348 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4kns\" (UniqueName: 
\"kubernetes.io/projected/228ffba7-ed93-43c8-b1df-c9c68c337461-kube-api-access-k4kns\") pod \"nova-scheduler-0\" (UID: \"228ffba7-ed93-43c8-b1df-c9c68c337461\") " pod="openstack/nova-scheduler-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.587710 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.809308 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 12:55:35 crc kubenswrapper[4727]: I1001 12:55:35.809655 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.006924 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.125344 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:55:36 crc kubenswrapper[4727]: W1001 12:55:36.126165 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod228ffba7_ed93_43c8_b1df_c9c68c337461.slice/crio-05e117d5b5718e5a3b481cefa664a8769616fb3d532d3fc1871e8d5770ddf384 WatchSource:0}: Error finding container 05e117d5b5718e5a3b481cefa664a8769616fb3d532d3fc1871e8d5770ddf384: Status 404 returned error can't find the container with id 05e117d5b5718e5a3b481cefa664a8769616fb3d532d3fc1871e8d5770ddf384 Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.156646 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-config-data\") pod \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.156704 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-combined-ca-bundle\") pod \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.156759 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-logs\") pod \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.156845 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs9hk\" (UniqueName: \"kubernetes.io/projected/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-kube-api-access-cs9hk\") pod \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\" (UID: \"f46d91a0-8be6-46bc-a325-5b4ccc433a6d\") " Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.157321 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-logs" (OuterVolumeSpecName: "logs") pod "f46d91a0-8be6-46bc-a325-5b4ccc433a6d" (UID: "f46d91a0-8be6-46bc-a325-5b4ccc433a6d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.162896 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-kube-api-access-cs9hk" (OuterVolumeSpecName: "kube-api-access-cs9hk") pod "f46d91a0-8be6-46bc-a325-5b4ccc433a6d" (UID: "f46d91a0-8be6-46bc-a325-5b4ccc433a6d"). InnerVolumeSpecName "kube-api-access-cs9hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.188727 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f46d91a0-8be6-46bc-a325-5b4ccc433a6d" (UID: "f46d91a0-8be6-46bc-a325-5b4ccc433a6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.203408 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-config-data" (OuterVolumeSpecName: "config-data") pod "f46d91a0-8be6-46bc-a325-5b4ccc433a6d" (UID: "f46d91a0-8be6-46bc-a325-5b4ccc433a6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.206647 4727 generic.go:334] "Generic (PLEG): container finished" podID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" containerID="55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf" exitCode=0 Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.206726 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.206746 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f46d91a0-8be6-46bc-a325-5b4ccc433a6d","Type":"ContainerDied","Data":"55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf"} Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.206784 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f46d91a0-8be6-46bc-a325-5b4ccc433a6d","Type":"ContainerDied","Data":"b43b2076af8dd0e5467155ae61e62f38b812ad0da0378a245aaa17ba4be7789d"} Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.206809 4727 scope.go:117] "RemoveContainer" containerID="55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.208849 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"228ffba7-ed93-43c8-b1df-c9c68c337461","Type":"ContainerStarted","Data":"05e117d5b5718e5a3b481cefa664a8769616fb3d532d3fc1871e8d5770ddf384"} Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.244640 4727 scope.go:117] "RemoveContainer" containerID="c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.250631 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.258831 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.258872 4727 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.258884 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.258895 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs9hk\" (UniqueName: \"kubernetes.io/projected/f46d91a0-8be6-46bc-a325-5b4ccc433a6d-kube-api-access-cs9hk\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.265180 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.274126 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 12:55:36 crc kubenswrapper[4727]: E1001 12:55:36.274601 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" containerName="nova-api-log" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.274625 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" containerName="nova-api-log" Oct 01 12:55:36 crc kubenswrapper[4727]: E1001 12:55:36.274650 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" containerName="nova-api-api" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.274659 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" containerName="nova-api-api" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.274903 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" containerName="nova-api-api" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.274930 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" containerName="nova-api-log" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.276699 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.277444 4727 scope.go:117] "RemoveContainer" containerID="55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf" Oct 01 12:55:36 crc kubenswrapper[4727]: E1001 12:55:36.277761 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf\": container with ID starting with 55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf not found: ID does not exist" containerID="55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.277800 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf"} err="failed to get container status \"55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf\": rpc error: code = NotFound desc = could not find container \"55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf\": container with ID starting with 55da78ffe0a2749bc6f01fb0405759347e1cc7b7b7dbb96c3beca0299291f6cf not found: ID does not exist" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.277827 4727 scope.go:117] "RemoveContainer" containerID="c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35" Oct 01 12:55:36 crc kubenswrapper[4727]: E1001 12:55:36.278059 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35\": container with ID starting with c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35 not found: ID does not exist" containerID="c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.278086 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35"} err="failed to get container status \"c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35\": rpc error: code = NotFound desc = could not find container \"c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35\": container with ID starting with c169c2b31bac92a0ac5a23c185257e82c227955360c2bbe1c26449b69819cf35 not found: ID does not exist" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.281334 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.285533 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.384984 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdce9668-f762-4b0d-b33a-76a1b270c575" path="/var/lib/kubelet/pods/bdce9668-f762-4b0d-b33a-76a1b270c575/volumes" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.386170 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46d91a0-8be6-46bc-a325-5b4ccc433a6d" path="/var/lib/kubelet/pods/f46d91a0-8be6-46bc-a325-5b4ccc433a6d/volumes" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.463138 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5275g\" (UniqueName: 
\"kubernetes.io/projected/d8db6f96-22e0-4d30-a612-29549be3b024-kube-api-access-5275g\") pod \"nova-api-0\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.463604 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8db6f96-22e0-4d30-a612-29549be3b024-config-data\") pod \"nova-api-0\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.463641 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8db6f96-22e0-4d30-a612-29549be3b024-logs\") pod \"nova-api-0\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.463703 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8db6f96-22e0-4d30-a612-29549be3b024-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.566209 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8db6f96-22e0-4d30-a612-29549be3b024-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.566363 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5275g\" (UniqueName: \"kubernetes.io/projected/d8db6f96-22e0-4d30-a612-29549be3b024-kube-api-access-5275g\") pod \"nova-api-0\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.566455 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8db6f96-22e0-4d30-a612-29549be3b024-config-data\") pod \"nova-api-0\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.566486 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8db6f96-22e0-4d30-a612-29549be3b024-logs\") pod \"nova-api-0\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.567620 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8db6f96-22e0-4d30-a612-29549be3b024-logs\") pod \"nova-api-0\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.570286 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8db6f96-22e0-4d30-a612-29549be3b024-config-data\") pod \"nova-api-0\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.574858 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d8db6f96-22e0-4d30-a612-29549be3b024-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.583531 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5275g\" (UniqueName: \"kubernetes.io/projected/d8db6f96-22e0-4d30-a612-29549be3b024-kube-api-access-5275g\") pod \"nova-api-0\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " pod="openstack/nova-api-0" Oct 01 12:55:36 crc kubenswrapper[4727]: I1001 12:55:36.606133 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:55:37 crc kubenswrapper[4727]: I1001 12:55:37.089902 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:55:37 crc kubenswrapper[4727]: I1001 12:55:37.219822 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"228ffba7-ed93-43c8-b1df-c9c68c337461","Type":"ContainerStarted","Data":"1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249"} Oct 01 12:55:37 crc kubenswrapper[4727]: I1001 12:55:37.221936 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8db6f96-22e0-4d30-a612-29549be3b024","Type":"ContainerStarted","Data":"b315dfc3e98da863512bd37962e837b543d7829bb683335dfa70d2cd55f31e83"} Oct 01 12:55:37 crc kubenswrapper[4727]: I1001 12:55:37.245550 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.245530671 podStartE2EDuration="2.245530671s" podCreationTimestamp="2025-10-01 12:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:37.235953467 +0000 UTC m=+1115.557308314" watchObservedRunningTime="2025-10-01 12:55:37.245530671 +0000 UTC m=+1115.566885508" Oct 01 12:55:38 crc kubenswrapper[4727]: I1001 12:55:38.234875 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8db6f96-22e0-4d30-a612-29549be3b024","Type":"ContainerStarted","Data":"bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1"} Oct 01 12:55:38 crc kubenswrapper[4727]: I1001 12:55:38.235255 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8db6f96-22e0-4d30-a612-29549be3b024","Type":"ContainerStarted","Data":"13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4"} Oct 01 12:55:38 crc kubenswrapper[4727]: I1001 12:55:38.251293 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.25127278 podStartE2EDuration="2.25127278s" podCreationTimestamp="2025-10-01 12:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:38.250569971 +0000 UTC m=+1116.571924828" watchObservedRunningTime="2025-10-01 12:55:38.25127278 +0000 UTC m=+1116.572627617" Oct 01 12:55:40 crc kubenswrapper[4727]: I1001 12:55:40.484518 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 01 12:55:40 crc kubenswrapper[4727]: I1001 12:55:40.588726 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 12:55:40 crc kubenswrapper[4727]: I1001 12:55:40.810028 
4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 12:55:40 crc kubenswrapper[4727]: I1001 12:55:40.810075 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 12:55:41 crc kubenswrapper[4727]: I1001 12:55:41.824180 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 12:55:41 crc kubenswrapper[4727]: I1001 12:55:41.824198 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 12:55:45 crc kubenswrapper[4727]: I1001 12:55:45.589031 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 12:55:45 crc kubenswrapper[4727]: I1001 12:55:45.626741 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 12:55:46 crc kubenswrapper[4727]: I1001 12:55:46.344592 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 12:55:46 crc kubenswrapper[4727]: I1001 12:55:46.607337 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:55:46 crc kubenswrapper[4727]: I1001 12:55:46.607929 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:55:47 crc kubenswrapper[4727]: I1001 12:55:47.689224 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d8db6f96-22e0-4d30-a612-29549be3b024" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 12:55:47 crc kubenswrapper[4727]: I1001 12:55:47.689311 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d8db6f96-22e0-4d30-a612-29549be3b024" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 12:55:50 crc kubenswrapper[4727]: I1001 12:55:50.815266 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 12:55:50 crc kubenswrapper[4727]: I1001 12:55:50.817715 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 12:55:50 crc kubenswrapper[4727]: I1001 12:55:50.820158 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 12:55:51 crc kubenswrapper[4727]: I1001 12:55:51.365527 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.279432 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.372934 4727 generic.go:334] "Generic (PLEG): container finished" podID="8e55ab75-534e-4cf9-9a5c-58d5da07ad7b" containerID="0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24" exitCode=137 Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.372975 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.373035 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b","Type":"ContainerDied","Data":"0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24"} Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.373077 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b","Type":"ContainerDied","Data":"44fdf3e6c0c1dc06d767f6bba2133a535605acadad2d24c7ab3e5e28ae01fc39"} Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.373094 4727 scope.go:117] "RemoveContainer" containerID="0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.396635 4727 scope.go:117] "RemoveContainer" containerID="0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24" Oct 01 12:55:53 crc kubenswrapper[4727]: E1001 12:55:53.397142 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24\": container with ID starting with 0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24 not found: ID does not exist" containerID="0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.397220 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24"} err="failed to get container status \"0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24\": rpc error: code = NotFound desc = could not find container \"0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24\": container with ID starting with 0be4c41c5e99aa80ea425c5ff6095c6bdfafaf786898f113cfddee6d74397b24 not found: ID does not exist" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.474051 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h5g4\" (UniqueName: \"kubernetes.io/projected/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-kube-api-access-2h5g4\") pod \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\" (UID: \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\") " Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.474199 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-config-data\") pod \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\" (UID: \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\") " Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.474448 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-combined-ca-bundle\") pod 
\"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\" (UID: \"8e55ab75-534e-4cf9-9a5c-58d5da07ad7b\") " Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.481739 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-kube-api-access-2h5g4" (OuterVolumeSpecName: "kube-api-access-2h5g4") pod "8e55ab75-534e-4cf9-9a5c-58d5da07ad7b" (UID: "8e55ab75-534e-4cf9-9a5c-58d5da07ad7b"). InnerVolumeSpecName "kube-api-access-2h5g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.507644 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-config-data" (OuterVolumeSpecName: "config-data") pod "8e55ab75-534e-4cf9-9a5c-58d5da07ad7b" (UID: "8e55ab75-534e-4cf9-9a5c-58d5da07ad7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.530205 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e55ab75-534e-4cf9-9a5c-58d5da07ad7b" (UID: "8e55ab75-534e-4cf9-9a5c-58d5da07ad7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.578185 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.578218 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.578230 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h5g4\" (UniqueName: \"kubernetes.io/projected/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b-kube-api-access-2h5g4\") on node \"crc\" DevicePath \"\"" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.708865 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.721668 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.731676 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:55:53 crc kubenswrapper[4727]: E1001 12:55:53.732282 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e55ab75-534e-4cf9-9a5c-58d5da07ad7b" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.732308 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e55ab75-534e-4cf9-9a5c-58d5da07ad7b" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.732545 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e55ab75-534e-4cf9-9a5c-58d5da07ad7b" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.733351 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.735364 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.735501 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.737383 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.742371 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.882932 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0de3207c-19cd-4cb7-a637-642aa2127265-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.883046 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0de3207c-19cd-4cb7-a637-642aa2127265-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.883105 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0de3207c-19cd-4cb7-a637-642aa2127265-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.883131 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0de3207c-19cd-4cb7-a637-642aa2127265-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.883337 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjfns\" (UniqueName: \"kubernetes.io/projected/0de3207c-19cd-4cb7-a637-642aa2127265-kube-api-access-hjfns\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.985242 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0de3207c-19cd-4cb7-a637-642aa2127265-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.985334 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0de3207c-19cd-4cb7-a637-642aa2127265-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.985369 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0de3207c-19cd-4cb7-a637-642aa2127265-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.985393 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0de3207c-19cd-4cb7-a637-642aa2127265-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.985412 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjfns\" (UniqueName: \"kubernetes.io/projected/0de3207c-19cd-4cb7-a637-642aa2127265-kube-api-access-hjfns\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.989551 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0de3207c-19cd-4cb7-a637-642aa2127265-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.989569 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0de3207c-19cd-4cb7-a637-642aa2127265-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.992093 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0de3207c-19cd-4cb7-a637-642aa2127265-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:53 crc kubenswrapper[4727]: I1001 12:55:53.999041 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0de3207c-19cd-4cb7-a637-642aa2127265-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:54 crc kubenswrapper[4727]: I1001 12:55:54.001925 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjfns\" (UniqueName: \"kubernetes.io/projected/0de3207c-19cd-4cb7-a637-642aa2127265-kube-api-access-hjfns\") pod \"nova-cell1-novncproxy-0\" (UID: \"0de3207c-19cd-4cb7-a637-642aa2127265\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:54 crc kubenswrapper[4727]: I1001 12:55:54.055492 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:54 crc kubenswrapper[4727]: I1001 12:55:54.388048 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e55ab75-534e-4cf9-9a5c-58d5da07ad7b" path="/var/lib/kubelet/pods/8e55ab75-534e-4cf9-9a5c-58d5da07ad7b/volumes" Oct 01 12:55:54 crc kubenswrapper[4727]: I1001 12:55:54.465702 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 12:55:54 crc kubenswrapper[4727]: W1001 12:55:54.470464 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0de3207c_19cd_4cb7_a637_642aa2127265.slice/crio-e7e191b8b459f03f05938f279769142963dbeb4f4de5942d550ca45ec062af3c WatchSource:0}: Error finding container e7e191b8b459f03f05938f279769142963dbeb4f4de5942d550ca45ec062af3c: Status 404 returned error can't find the container with id e7e191b8b459f03f05938f279769142963dbeb4f4de5942d550ca45ec062af3c Oct 01 12:55:55 crc kubenswrapper[4727]: I1001 12:55:55.408473 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0de3207c-19cd-4cb7-a637-642aa2127265","Type":"ContainerStarted","Data":"b2fd0494ba6857501f9134dc31c10c1dc5a9d6a21f66df289eb17dadaa4cfc65"} Oct 01 12:55:55 crc kubenswrapper[4727]: I1001 12:55:55.408856 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0de3207c-19cd-4cb7-a637-642aa2127265","Type":"ContainerStarted","Data":"e7e191b8b459f03f05938f279769142963dbeb4f4de5942d550ca45ec062af3c"} Oct 01 12:55:56 crc kubenswrapper[4727]: I1001 12:55:56.611432 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 12:55:56 crc kubenswrapper[4727]: I1001 12:55:56.612181 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 12:55:56 crc kubenswrapper[4727]: I1001 12:55:56.612445 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 12:55:56 crc kubenswrapper[4727]: I1001 12:55:56.615699 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 12:55:56 crc kubenswrapper[4727]: I1001 12:55:56.635297 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.635275494 podStartE2EDuration="3.635275494s" podCreationTimestamp="2025-10-01 12:55:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:55:55.435984517 +0000 UTC m=+1133.757339364" watchObservedRunningTime="2025-10-01 12:55:56.635275494 +0000 UTC m=+1134.956630341" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.329906 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.425421 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.428623 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.596535 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-lzvjj"] Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 
12:55:57.598326 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.617155 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-lzvjj"] Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.676781 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-config\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.676870 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.676924 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.677216 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.677530 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9bzf\" (UniqueName: \"kubernetes.io/projected/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-kube-api-access-p9bzf\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.677647 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.779632 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-config\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.779711 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc 
kubenswrapper[4727]: I1001 12:55:57.779770 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.779827 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.779910 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9bzf\" (UniqueName: \"kubernetes.io/projected/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-kube-api-access-p9bzf\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.780016 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.781057 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.781176 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.781385 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-config\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.781404 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.781430 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.816909 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-p9bzf\" (UniqueName: \"kubernetes.io/projected/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-kube-api-access-p9bzf\") pod \"dnsmasq-dns-59cf4bdb65-lzvjj\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:57 crc kubenswrapper[4727]: I1001 12:55:57.924877 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:55:58 crc kubenswrapper[4727]: I1001 12:55:58.441159 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-lzvjj"] Oct 01 12:55:58 crc kubenswrapper[4727]: W1001 12:55:58.445395 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d7e9ecd_3bb9_4f1a_a8f5_eb31340566ff.slice/crio-807f1b96b17d8999cf70fc05d8885f7b38bbd3fe0dd7d4f12b8ec25e7f15803d WatchSource:0}: Error finding container 807f1b96b17d8999cf70fc05d8885f7b38bbd3fe0dd7d4f12b8ec25e7f15803d: Status 404 returned error can't find the container with id 807f1b96b17d8999cf70fc05d8885f7b38bbd3fe0dd7d4f12b8ec25e7f15803d Oct 01 12:55:59 crc kubenswrapper[4727]: I1001 12:55:59.055980 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:55:59 crc kubenswrapper[4727]: I1001 12:55:59.433393 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:55:59 crc kubenswrapper[4727]: I1001 12:55:59.433740 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="ceilometer-central-agent" containerID="cri-o://c4ffa85c04b17609481b4ceab6673b2e6df719e0fdfedd586410b36416c5e0cc" gracePeriod=30 Oct 01 12:55:59 crc kubenswrapper[4727]: I1001 12:55:59.433884 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="ceilometer-notification-agent" containerID="cri-o://184d11bd35d7572dbf356a155a2801c9a8efcfee8103e175b674d63c5a20b094" gracePeriod=30 Oct 01 12:55:59 crc kubenswrapper[4727]: I1001 12:55:59.433928 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="proxy-httpd" containerID="cri-o://e135f691e09488d4250bfad3220af7049a0e6f679595beabf482a08dd683b6ca" gracePeriod=30 Oct 01 12:55:59 crc kubenswrapper[4727]: I1001 12:55:59.433940 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="sg-core" containerID="cri-o://c16e2f9b8424ed52750b02ed699a36ac6a57f4a0415d76bbc27179cdc0701b06" gracePeriod=30 Oct 01 12:55:59 crc kubenswrapper[4727]: I1001 12:55:59.458362 4727 generic.go:334] "Generic (PLEG): container finished" podID="3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" containerID="2a3c96ded502cb081fa8f39f3cbc04384e408297e9e2b596eaa95b0dfa5ab701" exitCode=0 Oct 01 12:55:59 crc kubenswrapper[4727]: I1001 12:55:59.460031 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" event={"ID":"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff","Type":"ContainerDied","Data":"2a3c96ded502cb081fa8f39f3cbc04384e408297e9e2b596eaa95b0dfa5ab701"} Oct 01 12:55:59 crc kubenswrapper[4727]: I1001 12:55:59.460066 4727 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" event={"ID":"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff","Type":"ContainerStarted","Data":"807f1b96b17d8999cf70fc05d8885f7b38bbd3fe0dd7d4f12b8ec25e7f15803d"} Oct 01 12:56:00 crc kubenswrapper[4727]: I1001 12:56:00.479589 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:00 crc kubenswrapper[4727]: I1001 12:56:00.497126 4727 generic.go:334] "Generic (PLEG): container finished" podID="27b755da-d064-4481-b856-4b51bb15cecb" containerID="e135f691e09488d4250bfad3220af7049a0e6f679595beabf482a08dd683b6ca" exitCode=0 Oct 01 12:56:00 crc kubenswrapper[4727]: I1001 12:56:00.497186 4727 generic.go:334] "Generic (PLEG): container finished" podID="27b755da-d064-4481-b856-4b51bb15cecb" containerID="c16e2f9b8424ed52750b02ed699a36ac6a57f4a0415d76bbc27179cdc0701b06" exitCode=2 Oct 01 12:56:00 crc kubenswrapper[4727]: I1001 12:56:00.497195 4727 generic.go:334] "Generic (PLEG): container finished" podID="27b755da-d064-4481-b856-4b51bb15cecb" containerID="c4ffa85c04b17609481b4ceab6673b2e6df719e0fdfedd586410b36416c5e0cc" exitCode=0 Oct 01 12:56:00 crc kubenswrapper[4727]: I1001 12:56:00.497328 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27b755da-d064-4481-b856-4b51bb15cecb","Type":"ContainerDied","Data":"e135f691e09488d4250bfad3220af7049a0e6f679595beabf482a08dd683b6ca"} Oct 01 12:56:00 crc kubenswrapper[4727]: I1001 12:56:00.497373 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27b755da-d064-4481-b856-4b51bb15cecb","Type":"ContainerDied","Data":"c16e2f9b8424ed52750b02ed699a36ac6a57f4a0415d76bbc27179cdc0701b06"} Oct 01 12:56:00 crc kubenswrapper[4727]: I1001 12:56:00.497386 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27b755da-d064-4481-b856-4b51bb15cecb","Type":"ContainerDied","Data":"c4ffa85c04b17609481b4ceab6673b2e6df719e0fdfedd586410b36416c5e0cc"} Oct 01 12:56:00 crc kubenswrapper[4727]: I1001 12:56:00.502625 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" event={"ID":"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff","Type":"ContainerStarted","Data":"133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce"} Oct 01 12:56:00 crc kubenswrapper[4727]: I1001 12:56:00.502652 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d8db6f96-22e0-4d30-a612-29549be3b024" containerName="nova-api-log" containerID="cri-o://13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4" gracePeriod=30 Oct 01 12:56:00 crc kubenswrapper[4727]: I1001 12:56:00.503320 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d8db6f96-22e0-4d30-a612-29549be3b024" containerName="nova-api-api" containerID="cri-o://bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1" gracePeriod=30 Oct 01 12:56:00 crc kubenswrapper[4727]: I1001 12:56:00.542279 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" podStartSLOduration=3.5422430560000002 podStartE2EDuration="3.542243056s" podCreationTimestamp="2025-10-01 12:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:00.531522 +0000 UTC m=+1138.852876837" watchObservedRunningTime="2025-10-01 12:56:00.542243056 +0000 
UTC m=+1138.863597893" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.515595 4727 generic.go:334] "Generic (PLEG): container finished" podID="d8db6f96-22e0-4d30-a612-29549be3b024" containerID="13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4" exitCode=143 Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.515661 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8db6f96-22e0-4d30-a612-29549be3b024","Type":"ContainerDied","Data":"13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4"} Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.519833 4727 generic.go:334] "Generic (PLEG): container finished" podID="27b755da-d064-4481-b856-4b51bb15cecb" containerID="184d11bd35d7572dbf356a155a2801c9a8efcfee8103e175b674d63c5a20b094" exitCode=0 Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.520741 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27b755da-d064-4481-b856-4b51bb15cecb","Type":"ContainerDied","Data":"184d11bd35d7572dbf356a155a2801c9a8efcfee8103e175b674d63c5a20b094"} Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.520917 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.617062 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.764466 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-config-data\") pod \"27b755da-d064-4481-b856-4b51bb15cecb\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.764801 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twkgw\" (UniqueName: \"kubernetes.io/projected/27b755da-d064-4481-b856-4b51bb15cecb-kube-api-access-twkgw\") pod \"27b755da-d064-4481-b856-4b51bb15cecb\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.764825 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-scripts\") pod \"27b755da-d064-4481-b856-4b51bb15cecb\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.764847 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27b755da-d064-4481-b856-4b51bb15cecb-run-httpd\") pod \"27b755da-d064-4481-b856-4b51bb15cecb\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.764921 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27b755da-d064-4481-b856-4b51bb15cecb-log-httpd\") pod \"27b755da-d064-4481-b856-4b51bb15cecb\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.765010 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-sg-core-conf-yaml\") pod \"27b755da-d064-4481-b856-4b51bb15cecb\" (UID: 
\"27b755da-d064-4481-b856-4b51bb15cecb\") " Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.765053 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-combined-ca-bundle\") pod \"27b755da-d064-4481-b856-4b51bb15cecb\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.766552 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27b755da-d064-4481-b856-4b51bb15cecb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "27b755da-d064-4481-b856-4b51bb15cecb" (UID: "27b755da-d064-4481-b856-4b51bb15cecb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.766726 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27b755da-d064-4481-b856-4b51bb15cecb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "27b755da-d064-4481-b856-4b51bb15cecb" (UID: "27b755da-d064-4481-b856-4b51bb15cecb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.771010 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27b755da-d064-4481-b856-4b51bb15cecb-kube-api-access-twkgw" (OuterVolumeSpecName: "kube-api-access-twkgw") pod "27b755da-d064-4481-b856-4b51bb15cecb" (UID: "27b755da-d064-4481-b856-4b51bb15cecb"). InnerVolumeSpecName "kube-api-access-twkgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.792374 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-scripts" (OuterVolumeSpecName: "scripts") pod "27b755da-d064-4481-b856-4b51bb15cecb" (UID: "27b755da-d064-4481-b856-4b51bb15cecb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.797217 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "27b755da-d064-4481-b856-4b51bb15cecb" (UID: "27b755da-d064-4481-b856-4b51bb15cecb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.849620 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27b755da-d064-4481-b856-4b51bb15cecb" (UID: "27b755da-d064-4481-b856-4b51bb15cecb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.872359 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-config-data" (OuterVolumeSpecName: "config-data") pod "27b755da-d064-4481-b856-4b51bb15cecb" (UID: "27b755da-d064-4481-b856-4b51bb15cecb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.878273 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-config-data\") pod \"27b755da-d064-4481-b856-4b51bb15cecb\" (UID: \"27b755da-d064-4481-b856-4b51bb15cecb\") " Oct 01 12:56:01 crc kubenswrapper[4727]: W1001 12:56:01.878395 4727 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/27b755da-d064-4481-b856-4b51bb15cecb/volumes/kubernetes.io~secret/config-data Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.878489 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-config-data" (OuterVolumeSpecName: "config-data") pod "27b755da-d064-4481-b856-4b51bb15cecb" (UID: "27b755da-d064-4481-b856-4b51bb15cecb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.879134 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.879154 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twkgw\" (UniqueName: \"kubernetes.io/projected/27b755da-d064-4481-b856-4b51bb15cecb-kube-api-access-twkgw\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.879167 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.879175 4727 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27b755da-d064-4481-b856-4b51bb15cecb-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.879183 4727 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27b755da-d064-4481-b856-4b51bb15cecb-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.879191 4727 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:01 crc kubenswrapper[4727]: I1001 12:56:01.879198 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b755da-d064-4481-b856-4b51bb15cecb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.530546 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27b755da-d064-4481-b856-4b51bb15cecb","Type":"ContainerDied","Data":"ab26382c2d1a3ca68c36b50aeb8656829916d467a5c3db235041143e82fda2c6"} Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.530577 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.530598 4727 scope.go:117] "RemoveContainer" containerID="e135f691e09488d4250bfad3220af7049a0e6f679595beabf482a08dd683b6ca" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.551411 4727 scope.go:117] "RemoveContainer" containerID="c16e2f9b8424ed52750b02ed699a36ac6a57f4a0415d76bbc27179cdc0701b06" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.567026 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.577285 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.586327 4727 scope.go:117] "RemoveContainer" containerID="184d11bd35d7572dbf356a155a2801c9a8efcfee8103e175b674d63c5a20b094" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.608449 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:02 crc kubenswrapper[4727]: E1001 12:56:02.608848 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="ceilometer-notification-agent" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.608860 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="ceilometer-notification-agent" Oct 01 12:56:02 crc kubenswrapper[4727]: E1001 12:56:02.608879 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="sg-core" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.608885 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="sg-core" Oct 01 12:56:02 crc kubenswrapper[4727]: E1001 12:56:02.608902 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="proxy-httpd" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.608909 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="proxy-httpd" Oct 01 12:56:02 crc kubenswrapper[4727]: E1001 12:56:02.608921 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="ceilometer-central-agent" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.608927 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="ceilometer-central-agent" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.609106 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="ceilometer-central-agent" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.609130 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="sg-core" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.609143 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="proxy-httpd" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.609160 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b755da-d064-4481-b856-4b51bb15cecb" containerName="ceilometer-notification-agent" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.610848 4727 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.614169 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.614448 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.621993 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.635576 4727 scope.go:117] "RemoveContainer" containerID="c4ffa85c04b17609481b4ceab6673b2e6df719e0fdfedd586410b36416c5e0cc" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.695278 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-config-data\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.695331 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.695612 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a96edf-502b-4528-a73b-b7fb945d3d80-log-httpd\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.695687 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a96edf-502b-4528-a73b-b7fb945d3d80-run-httpd\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.695749 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-scripts\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.695771 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njr8m\" (UniqueName: \"kubernetes.io/projected/92a96edf-502b-4528-a73b-b7fb945d3d80-kube-api-access-njr8m\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.695934 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.797532 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.797620 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-config-data\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.797665 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.797753 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a96edf-502b-4528-a73b-b7fb945d3d80-log-httpd\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.797787 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a96edf-502b-4528-a73b-b7fb945d3d80-run-httpd\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.797824 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-scripts\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.797843 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njr8m\" (UniqueName: \"kubernetes.io/projected/92a96edf-502b-4528-a73b-b7fb945d3d80-kube-api-access-njr8m\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.798297 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a96edf-502b-4528-a73b-b7fb945d3d80-log-httpd\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.798975 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a96edf-502b-4528-a73b-b7fb945d3d80-run-httpd\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.803787 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.803951 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-config-data\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.805532 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.812840 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-scripts\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.815920 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njr8m\" (UniqueName: \"kubernetes.io/projected/92a96edf-502b-4528-a73b-b7fb945d3d80-kube-api-access-njr8m\") pod \"ceilometer-0\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " pod="openstack/ceilometer-0" Oct 01 12:56:02 crc kubenswrapper[4727]: I1001 12:56:02.931421 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:03 crc kubenswrapper[4727]: I1001 12:56:03.292408 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:56:03 crc kubenswrapper[4727]: I1001 12:56:03.292660 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:56:03 crc kubenswrapper[4727]: I1001 12:56:03.391814 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:03 crc kubenswrapper[4727]: W1001 12:56:03.395155 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92a96edf_502b_4528_a73b_b7fb945d3d80.slice/crio-ec47f7cff295dc14ecbfdb166ee4877c535bb3e1587f840723f3dbe7f8362b73 WatchSource:0}: Error finding container ec47f7cff295dc14ecbfdb166ee4877c535bb3e1587f840723f3dbe7f8362b73: Status 404 returned error can't find the container with id ec47f7cff295dc14ecbfdb166ee4877c535bb3e1587f840723f3dbe7f8362b73 Oct 01 12:56:03 crc kubenswrapper[4727]: I1001 12:56:03.541261 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a96edf-502b-4528-a73b-b7fb945d3d80","Type":"ContainerStarted","Data":"ec47f7cff295dc14ecbfdb166ee4877c535bb3e1587f840723f3dbe7f8362b73"} Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.056548 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.076414 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.112776 4727 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.122715 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8db6f96-22e0-4d30-a612-29549be3b024-combined-ca-bundle\") pod \"d8db6f96-22e0-4d30-a612-29549be3b024\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.123127 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5275g\" (UniqueName: \"kubernetes.io/projected/d8db6f96-22e0-4d30-a612-29549be3b024-kube-api-access-5275g\") pod \"d8db6f96-22e0-4d30-a612-29549be3b024\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.123253 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8db6f96-22e0-4d30-a612-29549be3b024-logs\") pod \"d8db6f96-22e0-4d30-a612-29549be3b024\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.123449 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8db6f96-22e0-4d30-a612-29549be3b024-config-data\") pod \"d8db6f96-22e0-4d30-a612-29549be3b024\" (UID: \"d8db6f96-22e0-4d30-a612-29549be3b024\") " Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.125612 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8db6f96-22e0-4d30-a612-29549be3b024-logs" (OuterVolumeSpecName: "logs") pod "d8db6f96-22e0-4d30-a612-29549be3b024" (UID: "d8db6f96-22e0-4d30-a612-29549be3b024"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.128868 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8db6f96-22e0-4d30-a612-29549be3b024-kube-api-access-5275g" (OuterVolumeSpecName: "kube-api-access-5275g") pod "d8db6f96-22e0-4d30-a612-29549be3b024" (UID: "d8db6f96-22e0-4d30-a612-29549be3b024"). InnerVolumeSpecName "kube-api-access-5275g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.161060 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8db6f96-22e0-4d30-a612-29549be3b024-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8db6f96-22e0-4d30-a612-29549be3b024" (UID: "d8db6f96-22e0-4d30-a612-29549be3b024"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.176251 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8db6f96-22e0-4d30-a612-29549be3b024-config-data" (OuterVolumeSpecName: "config-data") pod "d8db6f96-22e0-4d30-a612-29549be3b024" (UID: "d8db6f96-22e0-4d30-a612-29549be3b024"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.232628 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5275g\" (UniqueName: \"kubernetes.io/projected/d8db6f96-22e0-4d30-a612-29549be3b024-kube-api-access-5275g\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.232947 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8db6f96-22e0-4d30-a612-29549be3b024-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.232959 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8db6f96-22e0-4d30-a612-29549be3b024-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.232975 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8db6f96-22e0-4d30-a612-29549be3b024-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.387213 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b755da-d064-4481-b856-4b51bb15cecb" path="/var/lib/kubelet/pods/27b755da-d064-4481-b856-4b51bb15cecb/volumes" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.552840 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a96edf-502b-4528-a73b-b7fb945d3d80","Type":"ContainerStarted","Data":"8a069a2b225cadd85feb1b4208eb9d85eb44cab68c20fedd3f32a9efc2e1577d"} Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.554631 4727 generic.go:334] "Generic (PLEG): container finished" podID="d8db6f96-22e0-4d30-a612-29549be3b024" containerID="bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1" exitCode=0 Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.554666 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8db6f96-22e0-4d30-a612-29549be3b024","Type":"ContainerDied","Data":"bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1"} Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.554705 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8db6f96-22e0-4d30-a612-29549be3b024","Type":"ContainerDied","Data":"b315dfc3e98da863512bd37962e837b543d7829bb683335dfa70d2cd55f31e83"} Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.554725 4727 scope.go:117] "RemoveContainer" containerID="bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.554779 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.581291 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.585548 4727 scope.go:117] "RemoveContainer" containerID="13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.586053 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.593780 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.610920 4727 scope.go:117] "RemoveContainer" containerID="bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1" Oct 01 12:56:04 crc kubenswrapper[4727]: E1001 12:56:04.611365 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1\": container with ID starting with bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1 not found: ID does not exist" containerID="bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.611393 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1"} err="failed to get container status \"bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1\": rpc error: code = NotFound desc = could not find container \"bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1\": container with ID starting with bde41fbb3ecfc787e26d13c540953fbcd3ce1e32aa7befbee54f1c0e073eeaa1 not found: ID does not exist" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.611415 4727 scope.go:117] "RemoveContainer" containerID="13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4" Oct 01 12:56:04 crc kubenswrapper[4727]: E1001 12:56:04.611786 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4\": container with ID starting with 13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4 not found: ID does not exist" containerID="13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.611900 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4"} err="failed to get container status \"13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4\": rpc error: code = NotFound desc = could not find container \"13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4\": container with ID starting with 13ba751d270f384802727666a281784a1835d3d638074590f17bd41665faefe4 not found: ID does not exist" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.612564 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:04 crc kubenswrapper[4727]: E1001 12:56:04.613122 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8db6f96-22e0-4d30-a612-29549be3b024" containerName="nova-api-log" Oct 01 12:56:04 crc 
kubenswrapper[4727]: I1001 12:56:04.613147 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8db6f96-22e0-4d30-a612-29549be3b024" containerName="nova-api-log" Oct 01 12:56:04 crc kubenswrapper[4727]: E1001 12:56:04.613171 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8db6f96-22e0-4d30-a612-29549be3b024" containerName="nova-api-api" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.613181 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8db6f96-22e0-4d30-a612-29549be3b024" containerName="nova-api-api" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.613454 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8db6f96-22e0-4d30-a612-29549be3b024" containerName="nova-api-api" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.613505 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8db6f96-22e0-4d30-a612-29549be3b024" containerName="nova-api-log" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.616610 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.619779 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.621103 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.624256 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.635940 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.643091 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.643144 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-config-data\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.643190 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glmqf\" (UniqueName: \"kubernetes.io/projected/428cf4e3-e21c-44ab-a562-4aee1e36956c-kube-api-access-glmqf\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.643257 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/428cf4e3-e21c-44ab-a562-4aee1e36956c-logs\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.643284 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.643452 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.745263 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.745323 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-config-data\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.745366 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glmqf\" (UniqueName: \"kubernetes.io/projected/428cf4e3-e21c-44ab-a562-4aee1e36956c-kube-api-access-glmqf\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.745400 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/428cf4e3-e21c-44ab-a562-4aee1e36956c-logs\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.745422 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-public-tls-certs\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.745528 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.752978 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/428cf4e3-e21c-44ab-a562-4aee1e36956c-logs\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.754630 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.754705 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-config-data\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " 
pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.756245 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-public-tls-certs\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.758849 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.768948 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glmqf\" (UniqueName: \"kubernetes.io/projected/428cf4e3-e21c-44ab-a562-4aee1e36956c-kube-api-access-glmqf\") pod \"nova-api-0\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " pod="openstack/nova-api-0" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.862644 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-rpkr4"] Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.864124 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.868968 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.869301 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.898081 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rpkr4"] Oct 01 12:56:04 crc kubenswrapper[4727]: I1001 12:56:04.948172 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.051648 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-scripts\") pod \"nova-cell1-cell-mapping-rpkr4\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.051711 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rpkr4\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.051790 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-config-data\") pod \"nova-cell1-cell-mapping-rpkr4\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.051878 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42tt7\" (UniqueName: \"kubernetes.io/projected/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-kube-api-access-42tt7\") pod \"nova-cell1-cell-mapping-rpkr4\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.153423 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-scripts\") pod \"nova-cell1-cell-mapping-rpkr4\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.153719 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rpkr4\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.153762 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-config-data\") pod \"nova-cell1-cell-mapping-rpkr4\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.153818 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42tt7\" (UniqueName: \"kubernetes.io/projected/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-kube-api-access-42tt7\") pod \"nova-cell1-cell-mapping-rpkr4\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.160895 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-config-data\") pod \"nova-cell1-cell-mapping-rpkr4\" (UID: 
\"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.161511 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rpkr4\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.162289 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-scripts\") pod \"nova-cell1-cell-mapping-rpkr4\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.181790 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42tt7\" (UniqueName: \"kubernetes.io/projected/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-kube-api-access-42tt7\") pod \"nova-cell1-cell-mapping-rpkr4\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.260044 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.464382 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:05 crc kubenswrapper[4727]: W1001 12:56:05.486840 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod428cf4e3_e21c_44ab_a562_4aee1e36956c.slice/crio-4a607ee7e212cf4cee46481bb180b96dcf5f0cbad9df248f75b0252c69303893 WatchSource:0}: Error finding container 4a607ee7e212cf4cee46481bb180b96dcf5f0cbad9df248f75b0252c69303893: Status 404 returned error can't find the container with id 4a607ee7e212cf4cee46481bb180b96dcf5f0cbad9df248f75b0252c69303893 Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.596737 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a96edf-502b-4528-a73b-b7fb945d3d80","Type":"ContainerStarted","Data":"ff70d54505331bbe6815204a48ca6d5350a7249e39753e5c312aae6dbfd6fc70"} Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.598836 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"428cf4e3-e21c-44ab-a562-4aee1e36956c","Type":"ContainerStarted","Data":"4a607ee7e212cf4cee46481bb180b96dcf5f0cbad9df248f75b0252c69303893"} Oct 01 12:56:05 crc kubenswrapper[4727]: I1001 12:56:05.761371 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rpkr4"] Oct 01 12:56:06 crc kubenswrapper[4727]: I1001 12:56:06.385988 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8db6f96-22e0-4d30-a612-29549be3b024" path="/var/lib/kubelet/pods/d8db6f96-22e0-4d30-a612-29549be3b024/volumes" Oct 01 12:56:06 crc kubenswrapper[4727]: I1001 12:56:06.615864 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rpkr4" event={"ID":"cc24fafb-1942-435d-8dd4-412ef1b4ebd6","Type":"ContainerStarted","Data":"90908d5469a3816efabd59e45d236920f8dcdb8ba2084930667fcc11ac99b0b7"} Oct 01 12:56:06 crc kubenswrapper[4727]: I1001 12:56:06.616427 4727 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-cell-mapping-rpkr4" event={"ID":"cc24fafb-1942-435d-8dd4-412ef1b4ebd6","Type":"ContainerStarted","Data":"b4a291d2b1ab6cdbc1826c3b20fc7d699919228ee77b915387f04d96f601c02d"} Oct 01 12:56:06 crc kubenswrapper[4727]: I1001 12:56:06.622290 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a96edf-502b-4528-a73b-b7fb945d3d80","Type":"ContainerStarted","Data":"f3539bfedbe166f40542484e56d0b55c37e0660807fd3c31668356168a31d3c7"} Oct 01 12:56:06 crc kubenswrapper[4727]: I1001 12:56:06.626714 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"428cf4e3-e21c-44ab-a562-4aee1e36956c","Type":"ContainerStarted","Data":"565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82"} Oct 01 12:56:06 crc kubenswrapper[4727]: I1001 12:56:06.626777 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"428cf4e3-e21c-44ab-a562-4aee1e36956c","Type":"ContainerStarted","Data":"d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398"} Oct 01 12:56:06 crc kubenswrapper[4727]: I1001 12:56:06.646582 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-rpkr4" podStartSLOduration=2.646561322 podStartE2EDuration="2.646561322s" podCreationTimestamp="2025-10-01 12:56:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:06.629882433 +0000 UTC m=+1144.951237260" watchObservedRunningTime="2025-10-01 12:56:06.646561322 +0000 UTC m=+1144.967916179" Oct 01 12:56:06 crc kubenswrapper[4727]: I1001 12:56:06.658477 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.658411559 podStartE2EDuration="2.658411559s" podCreationTimestamp="2025-10-01 12:56:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:06.651722825 +0000 UTC m=+1144.973077682" watchObservedRunningTime="2025-10-01 12:56:06.658411559 +0000 UTC m=+1144.979766396" Oct 01 12:56:07 crc kubenswrapper[4727]: I1001 12:56:07.927748 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:56:07 crc kubenswrapper[4727]: I1001 12:56:07.987142 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-mlvdt"] Oct 01 12:56:07 crc kubenswrapper[4727]: I1001 12:56:07.987401 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" podUID="a62fbedb-de7c-424a-bdec-92639359a708" containerName="dnsmasq-dns" containerID="cri-o://dedb8d305ee13ccb866e1f6882d20d74107dd9c5fdcf8048d754c24185f844ed" gracePeriod=10 Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.351407 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" podUID="a62fbedb-de7c-424a-bdec-92639359a708" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.186:5353: connect: connection refused" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.674195 4727 generic.go:334] "Generic (PLEG): container finished" podID="a62fbedb-de7c-424a-bdec-92639359a708" containerID="dedb8d305ee13ccb866e1f6882d20d74107dd9c5fdcf8048d754c24185f844ed" exitCode=0 Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 
12:56:08.674310 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" event={"ID":"a62fbedb-de7c-424a-bdec-92639359a708","Type":"ContainerDied","Data":"dedb8d305ee13ccb866e1f6882d20d74107dd9c5fdcf8048d754c24185f844ed"} Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.674358 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" event={"ID":"a62fbedb-de7c-424a-bdec-92639359a708","Type":"ContainerDied","Data":"6dfbaec7e566dc7c292b6881b877229338ecae9ee5047dc1c47e59864e20b7a1"} Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.674381 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dfbaec7e566dc7c292b6881b877229338ecae9ee5047dc1c47e59864e20b7a1" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.678647 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a96edf-502b-4528-a73b-b7fb945d3d80","Type":"ContainerStarted","Data":"6c48651d2332668b6e6ba84bdbb4eb8fc2a2a34743da42e5f5649d1c59591b28"} Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.678905 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.719211 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.619143926 podStartE2EDuration="6.719190013s" podCreationTimestamp="2025-10-01 12:56:02 +0000 UTC" firstStartedPulling="2025-10-01 12:56:03.397274327 +0000 UTC m=+1141.718629164" lastFinishedPulling="2025-10-01 12:56:07.497320374 +0000 UTC m=+1145.818675251" observedRunningTime="2025-10-01 12:56:08.712874359 +0000 UTC m=+1147.034229216" watchObservedRunningTime="2025-10-01 12:56:08.719190013 +0000 UTC m=+1147.040544850" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.731983 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.759652 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-dns-swift-storage-0\") pod \"a62fbedb-de7c-424a-bdec-92639359a708\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.759775 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-ovsdbserver-nb\") pod \"a62fbedb-de7c-424a-bdec-92639359a708\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.759863 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-dns-svc\") pod \"a62fbedb-de7c-424a-bdec-92639359a708\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.759930 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-config\") pod \"a62fbedb-de7c-424a-bdec-92639359a708\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.759963 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-ovsdbserver-sb\") pod \"a62fbedb-de7c-424a-bdec-92639359a708\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.760026 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg57b\" (UniqueName: \"kubernetes.io/projected/a62fbedb-de7c-424a-bdec-92639359a708-kube-api-access-kg57b\") pod \"a62fbedb-de7c-424a-bdec-92639359a708\" (UID: \"a62fbedb-de7c-424a-bdec-92639359a708\") " Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.793608 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a62fbedb-de7c-424a-bdec-92639359a708-kube-api-access-kg57b" (OuterVolumeSpecName: "kube-api-access-kg57b") pod "a62fbedb-de7c-424a-bdec-92639359a708" (UID: "a62fbedb-de7c-424a-bdec-92639359a708"). InnerVolumeSpecName "kube-api-access-kg57b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.832736 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a62fbedb-de7c-424a-bdec-92639359a708" (UID: "a62fbedb-de7c-424a-bdec-92639359a708"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.862901 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.862946 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg57b\" (UniqueName: \"kubernetes.io/projected/a62fbedb-de7c-424a-bdec-92639359a708-kube-api-access-kg57b\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.871221 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a62fbedb-de7c-424a-bdec-92639359a708" (UID: "a62fbedb-de7c-424a-bdec-92639359a708"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.872886 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-config" (OuterVolumeSpecName: "config") pod "a62fbedb-de7c-424a-bdec-92639359a708" (UID: "a62fbedb-de7c-424a-bdec-92639359a708"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.882027 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a62fbedb-de7c-424a-bdec-92639359a708" (UID: "a62fbedb-de7c-424a-bdec-92639359a708"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.906800 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a62fbedb-de7c-424a-bdec-92639359a708" (UID: "a62fbedb-de7c-424a-bdec-92639359a708"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.964647 4727 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.964697 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.964712 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:08 crc kubenswrapper[4727]: I1001 12:56:08.964721 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a62fbedb-de7c-424a-bdec-92639359a708-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:09 crc kubenswrapper[4727]: I1001 12:56:09.686703 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-mlvdt" Oct 01 12:56:09 crc kubenswrapper[4727]: I1001 12:56:09.721601 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-mlvdt"] Oct 01 12:56:09 crc kubenswrapper[4727]: I1001 12:56:09.731288 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-mlvdt"] Oct 01 12:56:10 crc kubenswrapper[4727]: I1001 12:56:10.386505 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a62fbedb-de7c-424a-bdec-92639359a708" path="/var/lib/kubelet/pods/a62fbedb-de7c-424a-bdec-92639359a708/volumes" Oct 01 12:56:10 crc kubenswrapper[4727]: E1001 12:56:10.829166 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b755da_d064_4481_b856_4b51bb15cecb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b755da_d064_4481_b856_4b51bb15cecb.slice/crio-ab26382c2d1a3ca68c36b50aeb8656829916d467a5c3db235041143e82fda2c6\": RecentStats: unable to find data in memory cache]" Oct 01 12:56:12 crc kubenswrapper[4727]: I1001 12:56:12.735705 4727 generic.go:334] "Generic (PLEG): container finished" podID="cc24fafb-1942-435d-8dd4-412ef1b4ebd6" containerID="90908d5469a3816efabd59e45d236920f8dcdb8ba2084930667fcc11ac99b0b7" exitCode=0 Oct 01 12:56:12 crc kubenswrapper[4727]: I1001 12:56:12.735784 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rpkr4" event={"ID":"cc24fafb-1942-435d-8dd4-412ef1b4ebd6","Type":"ContainerDied","Data":"90908d5469a3816efabd59e45d236920f8dcdb8ba2084930667fcc11ac99b0b7"} Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.155209 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.353904 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-combined-ca-bundle\") pod \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.353977 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-config-data\") pod \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.354105 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-scripts\") pod \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.354302 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42tt7\" (UniqueName: \"kubernetes.io/projected/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-kube-api-access-42tt7\") pod \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\" (UID: \"cc24fafb-1942-435d-8dd4-412ef1b4ebd6\") " Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.361141 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-scripts" (OuterVolumeSpecName: "scripts") pod "cc24fafb-1942-435d-8dd4-412ef1b4ebd6" (UID: "cc24fafb-1942-435d-8dd4-412ef1b4ebd6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.361182 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-kube-api-access-42tt7" (OuterVolumeSpecName: "kube-api-access-42tt7") pod "cc24fafb-1942-435d-8dd4-412ef1b4ebd6" (UID: "cc24fafb-1942-435d-8dd4-412ef1b4ebd6"). InnerVolumeSpecName "kube-api-access-42tt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.384514 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc24fafb-1942-435d-8dd4-412ef1b4ebd6" (UID: "cc24fafb-1942-435d-8dd4-412ef1b4ebd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.403635 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-config-data" (OuterVolumeSpecName: "config-data") pod "cc24fafb-1942-435d-8dd4-412ef1b4ebd6" (UID: "cc24fafb-1942-435d-8dd4-412ef1b4ebd6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.457877 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42tt7\" (UniqueName: \"kubernetes.io/projected/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-kube-api-access-42tt7\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.457925 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.457936 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.457947 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc24fafb-1942-435d-8dd4-412ef1b4ebd6-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.756399 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rpkr4" event={"ID":"cc24fafb-1942-435d-8dd4-412ef1b4ebd6","Type":"ContainerDied","Data":"b4a291d2b1ab6cdbc1826c3b20fc7d699919228ee77b915387f04d96f601c02d"} Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.756450 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4a291d2b1ab6cdbc1826c3b20fc7d699919228ee77b915387f04d96f601c02d" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.756529 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rpkr4" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.940294 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.940502 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="228ffba7-ed93-43c8-b1df-c9c68c337461" containerName="nova-scheduler-scheduler" containerID="cri-o://1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249" gracePeriod=30 Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.949113 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.949165 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.950087 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.965317 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.965583 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerName="nova-metadata-log" containerID="cri-o://2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269" gracePeriod=30 Oct 01 12:56:14 crc kubenswrapper[4727]: I1001 12:56:14.966018 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" 
containerName="nova-metadata-metadata" containerID="cri-o://02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc" gracePeriod=30 Oct 01 12:56:15 crc kubenswrapper[4727]: E1001 12:56:15.591874 4727 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 12:56:15 crc kubenswrapper[4727]: E1001 12:56:15.594268 4727 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 12:56:15 crc kubenswrapper[4727]: E1001 12:56:15.595535 4727 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 12:56:15 crc kubenswrapper[4727]: E1001 12:56:15.595628 4727 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="228ffba7-ed93-43c8-b1df-c9c68c337461" containerName="nova-scheduler-scheduler" Oct 01 12:56:15 crc kubenswrapper[4727]: I1001 12:56:15.770386 4727 generic.go:334] "Generic (PLEG): container finished" podID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerID="2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269" exitCode=143 Oct 01 12:56:15 crc kubenswrapper[4727]: I1001 12:56:15.770470 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c077ba19-0c1d-469a-8614-90ec0aa263ba","Type":"ContainerDied","Data":"2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269"} Oct 01 12:56:15 crc kubenswrapper[4727]: I1001 12:56:15.770967 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="428cf4e3-e21c-44ab-a562-4aee1e36956c" containerName="nova-api-log" containerID="cri-o://d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398" gracePeriod=30 Oct 01 12:56:15 crc kubenswrapper[4727]: I1001 12:56:15.771031 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="428cf4e3-e21c-44ab-a562-4aee1e36956c" containerName="nova-api-api" containerID="cri-o://565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82" gracePeriod=30 Oct 01 12:56:15 crc kubenswrapper[4727]: I1001 12:56:15.775463 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="428cf4e3-e21c-44ab-a562-4aee1e36956c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.197:8774/\": EOF" Oct 01 12:56:15 crc kubenswrapper[4727]: I1001 12:56:15.780454 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="428cf4e3-e21c-44ab-a562-4aee1e36956c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.197:8774/\": EOF" Oct 01 12:56:16 crc 
kubenswrapper[4727]: I1001 12:56:16.781415 4727 generic.go:334] "Generic (PLEG): container finished" podID="428cf4e3-e21c-44ab-a562-4aee1e36956c" containerID="d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398" exitCode=143 Oct 01 12:56:16 crc kubenswrapper[4727]: I1001 12:56:16.781487 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"428cf4e3-e21c-44ab-a562-4aee1e36956c","Type":"ContainerDied","Data":"d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398"} Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.093415 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:36740->10.217.0.191:8775: read: connection reset by peer" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.093489 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:36746->10.217.0.191:8775: read: connection reset by peer" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.605680 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.735300 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-config-data\") pod \"c077ba19-0c1d-469a-8614-90ec0aa263ba\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.735443 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmmbz\" (UniqueName: \"kubernetes.io/projected/c077ba19-0c1d-469a-8614-90ec0aa263ba-kube-api-access-kmmbz\") pod \"c077ba19-0c1d-469a-8614-90ec0aa263ba\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.735502 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-nova-metadata-tls-certs\") pod \"c077ba19-0c1d-469a-8614-90ec0aa263ba\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.735571 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c077ba19-0c1d-469a-8614-90ec0aa263ba-logs\") pod \"c077ba19-0c1d-469a-8614-90ec0aa263ba\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.736020 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c077ba19-0c1d-469a-8614-90ec0aa263ba-logs" (OuterVolumeSpecName: "logs") pod "c077ba19-0c1d-469a-8614-90ec0aa263ba" (UID: "c077ba19-0c1d-469a-8614-90ec0aa263ba"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.736213 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-combined-ca-bundle\") pod \"c077ba19-0c1d-469a-8614-90ec0aa263ba\" (UID: \"c077ba19-0c1d-469a-8614-90ec0aa263ba\") " Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.736530 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c077ba19-0c1d-469a-8614-90ec0aa263ba-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.741409 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c077ba19-0c1d-469a-8614-90ec0aa263ba-kube-api-access-kmmbz" (OuterVolumeSpecName: "kube-api-access-kmmbz") pod "c077ba19-0c1d-469a-8614-90ec0aa263ba" (UID: "c077ba19-0c1d-469a-8614-90ec0aa263ba"). InnerVolumeSpecName "kube-api-access-kmmbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.768542 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-config-data" (OuterVolumeSpecName: "config-data") pod "c077ba19-0c1d-469a-8614-90ec0aa263ba" (UID: "c077ba19-0c1d-469a-8614-90ec0aa263ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.796524 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c077ba19-0c1d-469a-8614-90ec0aa263ba" (UID: "c077ba19-0c1d-469a-8614-90ec0aa263ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.802790 4727 generic.go:334] "Generic (PLEG): container finished" podID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerID="02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc" exitCode=0 Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.802835 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c077ba19-0c1d-469a-8614-90ec0aa263ba","Type":"ContainerDied","Data":"02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc"} Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.802862 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c077ba19-0c1d-469a-8614-90ec0aa263ba","Type":"ContainerDied","Data":"7f21797b7672e143e0584fa65b55ab7821a2f84c47a689c58456ad08eee7b2f5"} Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.802864 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.802882 4727 scope.go:117] "RemoveContainer" containerID="02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.826402 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c077ba19-0c1d-469a-8614-90ec0aa263ba" (UID: "c077ba19-0c1d-469a-8614-90ec0aa263ba"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.838315 4727 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.838355 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.838366 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c077ba19-0c1d-469a-8614-90ec0aa263ba-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.838377 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmmbz\" (UniqueName: \"kubernetes.io/projected/c077ba19-0c1d-469a-8614-90ec0aa263ba-kube-api-access-kmmbz\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.860193 4727 scope.go:117] "RemoveContainer" containerID="2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.888740 4727 scope.go:117] "RemoveContainer" containerID="02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc" Oct 01 12:56:18 crc kubenswrapper[4727]: E1001 12:56:18.889305 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc\": container with ID starting with 02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc not found: ID does not exist" containerID="02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.889347 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc"} err="failed to get container status \"02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc\": rpc error: code = NotFound desc = could not find container \"02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc\": container with ID starting with 02290efc5268733d01bfd3357aee6706849eff14f8b80eb9f7f3bbb445774acc not found: ID does not exist" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.889376 4727 scope.go:117] "RemoveContainer" containerID="2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269" Oct 01 12:56:18 crc kubenswrapper[4727]: E1001 12:56:18.889796 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269\": container with ID starting with 2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269 not found: ID does not exist" containerID="2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269" Oct 01 12:56:18 crc kubenswrapper[4727]: I1001 12:56:18.889830 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269"} err="failed to get container status \"2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269\": rpc error: code = NotFound desc = could not find container \"2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269\": container with ID starting with 2f7a3c16260c74c057b6dbc39427c0d0c3462479ed2bffb33d9f5ed26df93269 not found: ID does not exist" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.134609 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.145733 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.153348 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:19 crc kubenswrapper[4727]: E1001 12:56:19.153742 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a62fbedb-de7c-424a-bdec-92639359a708" containerName="init" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.153762 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62fbedb-de7c-424a-bdec-92639359a708" containerName="init" Oct 01 12:56:19 crc kubenswrapper[4727]: E1001 12:56:19.153777 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc24fafb-1942-435d-8dd4-412ef1b4ebd6" containerName="nova-manage" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.153784 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc24fafb-1942-435d-8dd4-412ef1b4ebd6" containerName="nova-manage" Oct 01 12:56:19 crc kubenswrapper[4727]: E1001 12:56:19.153799 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerName="nova-metadata-log" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.153806 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerName="nova-metadata-log" Oct 01 12:56:19 crc kubenswrapper[4727]: E1001 12:56:19.153829 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a62fbedb-de7c-424a-bdec-92639359a708" containerName="dnsmasq-dns" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.153835 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62fbedb-de7c-424a-bdec-92639359a708" containerName="dnsmasq-dns" Oct 01 12:56:19 crc kubenswrapper[4727]: E1001 12:56:19.153850 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerName="nova-metadata-metadata" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.153858 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerName="nova-metadata-metadata" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.154045 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a62fbedb-de7c-424a-bdec-92639359a708" containerName="dnsmasq-dns" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 
12:56:19.154060 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerName="nova-metadata-log" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.154074 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" containerName="nova-metadata-metadata" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.154094 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc24fafb-1942-435d-8dd4-412ef1b4ebd6" containerName="nova-manage" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.155062 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.156798 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.167663 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.169582 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.346143 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.346514 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-logs\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.346550 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.346625 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtp4x\" (UniqueName: \"kubernetes.io/projected/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-kube-api-access-qtp4x\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.346681 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-config-data\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.447584 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc 
kubenswrapper[4727]: I1001 12:56:19.447623 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-logs\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.447653 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.447720 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtp4x\" (UniqueName: \"kubernetes.io/projected/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-kube-api-access-qtp4x\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.447771 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-config-data\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.448432 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-logs\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.451754 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.452571 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.453110 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-config-data\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.468384 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtp4x\" (UniqueName: \"kubernetes.io/projected/5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca-kube-api-access-qtp4x\") pod \"nova-metadata-0\" (UID: \"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca\") " pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.523895 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.788187 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.819497 4727 generic.go:334] "Generic (PLEG): container finished" podID="228ffba7-ed93-43c8-b1df-c9c68c337461" containerID="1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249" exitCode=0 Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.819563 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"228ffba7-ed93-43c8-b1df-c9c68c337461","Type":"ContainerDied","Data":"1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249"} Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.819577 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.819602 4727 scope.go:117] "RemoveContainer" containerID="1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.819590 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"228ffba7-ed93-43c8-b1df-c9c68c337461","Type":"ContainerDied","Data":"05e117d5b5718e5a3b481cefa664a8769616fb3d532d3fc1871e8d5770ddf384"} Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.858987 4727 scope.go:117] "RemoveContainer" containerID="1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249" Oct 01 12:56:19 crc kubenswrapper[4727]: E1001 12:56:19.859435 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249\": container with ID starting with 1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249 not found: ID does not exist" containerID="1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.859464 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249"} err="failed to get container status \"1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249\": rpc error: code = NotFound desc = could not find container \"1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249\": container with ID starting with 1ec5510902c13ae349866ca0c59a0826811487cf418fe9e0fc2f710a6e97e249 not found: ID does not exist" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.959401 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4kns\" (UniqueName: \"kubernetes.io/projected/228ffba7-ed93-43c8-b1df-c9c68c337461-kube-api-access-k4kns\") pod \"228ffba7-ed93-43c8-b1df-c9c68c337461\" (UID: \"228ffba7-ed93-43c8-b1df-c9c68c337461\") " Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.960498 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/228ffba7-ed93-43c8-b1df-c9c68c337461-config-data\") pod \"228ffba7-ed93-43c8-b1df-c9c68c337461\" (UID: \"228ffba7-ed93-43c8-b1df-c9c68c337461\") " Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.960645 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228ffba7-ed93-43c8-b1df-c9c68c337461-combined-ca-bundle\") pod \"228ffba7-ed93-43c8-b1df-c9c68c337461\" (UID: 
\"228ffba7-ed93-43c8-b1df-c9c68c337461\") " Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.965543 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228ffba7-ed93-43c8-b1df-c9c68c337461-kube-api-access-k4kns" (OuterVolumeSpecName: "kube-api-access-k4kns") pod "228ffba7-ed93-43c8-b1df-c9c68c337461" (UID: "228ffba7-ed93-43c8-b1df-c9c68c337461"). InnerVolumeSpecName "kube-api-access-k4kns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.990908 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228ffba7-ed93-43c8-b1df-c9c68c337461-config-data" (OuterVolumeSpecName: "config-data") pod "228ffba7-ed93-43c8-b1df-c9c68c337461" (UID: "228ffba7-ed93-43c8-b1df-c9c68c337461"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:19 crc kubenswrapper[4727]: I1001 12:56:19.996155 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228ffba7-ed93-43c8-b1df-c9c68c337461-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "228ffba7-ed93-43c8-b1df-c9c68c337461" (UID: "228ffba7-ed93-43c8-b1df-c9c68c337461"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.040881 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.063596 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/228ffba7-ed93-43c8-b1df-c9c68c337461-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.063648 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228ffba7-ed93-43c8-b1df-c9c68c337461-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.063664 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4kns\" (UniqueName: \"kubernetes.io/projected/228ffba7-ed93-43c8-b1df-c9c68c337461-kube-api-access-k4kns\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.179083 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.215107 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.235077 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:20 crc kubenswrapper[4727]: E1001 12:56:20.235716 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228ffba7-ed93-43c8-b1df-c9c68c337461" containerName="nova-scheduler-scheduler" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.235730 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="228ffba7-ed93-43c8-b1df-c9c68c337461" containerName="nova-scheduler-scheduler" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.235928 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="228ffba7-ed93-43c8-b1df-c9c68c337461" containerName="nova-scheduler-scheduler" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.236636 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.245651 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.250054 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.371978 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a1e4c0-8104-4710-91ca-e9a32c934c9b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"79a1e4c0-8104-4710-91ca-e9a32c934c9b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.372068 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a1e4c0-8104-4710-91ca-e9a32c934c9b-config-data\") pod \"nova-scheduler-0\" (UID: \"79a1e4c0-8104-4710-91ca-e9a32c934c9b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.372128 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4k4q\" (UniqueName: \"kubernetes.io/projected/79a1e4c0-8104-4710-91ca-e9a32c934c9b-kube-api-access-h4k4q\") pod \"nova-scheduler-0\" (UID: \"79a1e4c0-8104-4710-91ca-e9a32c934c9b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.392475 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="228ffba7-ed93-43c8-b1df-c9c68c337461" path="/var/lib/kubelet/pods/228ffba7-ed93-43c8-b1df-c9c68c337461/volumes" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.393231 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c077ba19-0c1d-469a-8614-90ec0aa263ba" path="/var/lib/kubelet/pods/c077ba19-0c1d-469a-8614-90ec0aa263ba/volumes" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.473705 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a1e4c0-8104-4710-91ca-e9a32c934c9b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"79a1e4c0-8104-4710-91ca-e9a32c934c9b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.473749 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a1e4c0-8104-4710-91ca-e9a32c934c9b-config-data\") pod \"nova-scheduler-0\" (UID: \"79a1e4c0-8104-4710-91ca-e9a32c934c9b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.473801 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4k4q\" (UniqueName: \"kubernetes.io/projected/79a1e4c0-8104-4710-91ca-e9a32c934c9b-kube-api-access-h4k4q\") pod \"nova-scheduler-0\" (UID: \"79a1e4c0-8104-4710-91ca-e9a32c934c9b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.478144 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79a1e4c0-8104-4710-91ca-e9a32c934c9b-config-data\") pod \"nova-scheduler-0\" (UID: \"79a1e4c0-8104-4710-91ca-e9a32c934c9b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 
12:56:20.478353 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79a1e4c0-8104-4710-91ca-e9a32c934c9b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"79a1e4c0-8104-4710-91ca-e9a32c934c9b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.501670 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4k4q\" (UniqueName: \"kubernetes.io/projected/79a1e4c0-8104-4710-91ca-e9a32c934c9b-kube-api-access-h4k4q\") pod \"nova-scheduler-0\" (UID: \"79a1e4c0-8104-4710-91ca-e9a32c934c9b\") " pod="openstack/nova-scheduler-0" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.581106 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.835808 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca","Type":"ContainerStarted","Data":"0b4e50a2fde4f859ff1a7176ed00de675327a7c81fcca1a29f2c3f3c561570ed"} Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.836081 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca","Type":"ContainerStarted","Data":"e073f29d364b8631ef49441eed9e24bb787d12b2e4e80f1dea457837b6d8b9b4"} Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.836092 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca","Type":"ContainerStarted","Data":"b0157575f4f33261f342e6cccbe5a6891393e47394fde5dd2b3dd0e3832795e5"} Oct 01 12:56:20 crc kubenswrapper[4727]: I1001 12:56:20.860201 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.860181149 podStartE2EDuration="1.860181149s" podCreationTimestamp="2025-10-01 12:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:20.855579874 +0000 UTC m=+1159.176934701" watchObservedRunningTime="2025-10-01 12:56:20.860181149 +0000 UTC m=+1159.181535986" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.029128 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 12:56:21 crc kubenswrapper[4727]: E1001 12:56:21.099915 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b755da_d064_4481_b856_4b51bb15cecb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b755da_d064_4481_b856_4b51bb15cecb.slice/crio-ab26382c2d1a3ca68c36b50aeb8656829916d467a5c3db235041143e82fda2c6\": RecentStats: unable to find data in memory cache]" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.646617 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.798660 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glmqf\" (UniqueName: \"kubernetes.io/projected/428cf4e3-e21c-44ab-a562-4aee1e36956c-kube-api-access-glmqf\") pod \"428cf4e3-e21c-44ab-a562-4aee1e36956c\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.798786 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/428cf4e3-e21c-44ab-a562-4aee1e36956c-logs\") pod \"428cf4e3-e21c-44ab-a562-4aee1e36956c\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.798896 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-combined-ca-bundle\") pod \"428cf4e3-e21c-44ab-a562-4aee1e36956c\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.798931 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-config-data\") pod \"428cf4e3-e21c-44ab-a562-4aee1e36956c\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.798968 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-public-tls-certs\") pod \"428cf4e3-e21c-44ab-a562-4aee1e36956c\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.799054 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-internal-tls-certs\") pod \"428cf4e3-e21c-44ab-a562-4aee1e36956c\" (UID: \"428cf4e3-e21c-44ab-a562-4aee1e36956c\") " Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.799811 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/428cf4e3-e21c-44ab-a562-4aee1e36956c-logs" (OuterVolumeSpecName: "logs") pod "428cf4e3-e21c-44ab-a562-4aee1e36956c" (UID: "428cf4e3-e21c-44ab-a562-4aee1e36956c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.805828 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/428cf4e3-e21c-44ab-a562-4aee1e36956c-kube-api-access-glmqf" (OuterVolumeSpecName: "kube-api-access-glmqf") pod "428cf4e3-e21c-44ab-a562-4aee1e36956c" (UID: "428cf4e3-e21c-44ab-a562-4aee1e36956c"). InnerVolumeSpecName "kube-api-access-glmqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.826549 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-config-data" (OuterVolumeSpecName: "config-data") pod "428cf4e3-e21c-44ab-a562-4aee1e36956c" (UID: "428cf4e3-e21c-44ab-a562-4aee1e36956c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.841945 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "428cf4e3-e21c-44ab-a562-4aee1e36956c" (UID: "428cf4e3-e21c-44ab-a562-4aee1e36956c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.849872 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"79a1e4c0-8104-4710-91ca-e9a32c934c9b","Type":"ContainerStarted","Data":"ebbf425ed124eaa361eda47b1c5faec91e2018db87eed8f9c6367679d2cec1dc"} Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.849920 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"79a1e4c0-8104-4710-91ca-e9a32c934c9b","Type":"ContainerStarted","Data":"f32c759c657ff20f042a08962629237992aa8fff4a3b3a8f8c498d65f7c8da5f"} Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.853372 4727 generic.go:334] "Generic (PLEG): container finished" podID="428cf4e3-e21c-44ab-a562-4aee1e36956c" containerID="565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82" exitCode=0 Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.853426 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.853468 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"428cf4e3-e21c-44ab-a562-4aee1e36956c","Type":"ContainerDied","Data":"565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82"} Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.853505 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"428cf4e3-e21c-44ab-a562-4aee1e36956c","Type":"ContainerDied","Data":"4a607ee7e212cf4cee46481bb180b96dcf5f0cbad9df248f75b0252c69303893"} Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.853526 4727 scope.go:117] "RemoveContainer" containerID="565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.865292 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "428cf4e3-e21c-44ab-a562-4aee1e36956c" (UID: "428cf4e3-e21c-44ab-a562-4aee1e36956c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.867046 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.867029603 podStartE2EDuration="1.867029603s" podCreationTimestamp="2025-10-01 12:56:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:21.866100694 +0000 UTC m=+1160.187455551" watchObservedRunningTime="2025-10-01 12:56:21.867029603 +0000 UTC m=+1160.188384440" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.868421 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "428cf4e3-e21c-44ab-a562-4aee1e36956c" (UID: "428cf4e3-e21c-44ab-a562-4aee1e36956c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.887203 4727 scope.go:117] "RemoveContainer" containerID="d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.901652 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.901687 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.901696 4727 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.901706 4727 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/428cf4e3-e21c-44ab-a562-4aee1e36956c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.901716 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glmqf\" (UniqueName: \"kubernetes.io/projected/428cf4e3-e21c-44ab-a562-4aee1e36956c-kube-api-access-glmqf\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.901725 4727 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/428cf4e3-e21c-44ab-a562-4aee1e36956c-logs\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.906570 4727 scope.go:117] "RemoveContainer" containerID="565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82" Oct 01 12:56:21 crc kubenswrapper[4727]: E1001 12:56:21.907180 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82\": container with ID starting with 565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82 not found: ID does not exist" containerID="565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.907214 4727 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82"} err="failed to get container status \"565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82\": rpc error: code = NotFound desc = could not find container \"565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82\": container with ID starting with 565e7e9ad1ec87eeffae7ef0ed1054ed9b7e2365906c4734a90040093a9c1f82 not found: ID does not exist" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.907235 4727 scope.go:117] "RemoveContainer" containerID="d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398" Oct 01 12:56:21 crc kubenswrapper[4727]: E1001 12:56:21.907499 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398\": container with ID starting with d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398 not found: ID does not exist" containerID="d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398" Oct 01 12:56:21 crc kubenswrapper[4727]: I1001 12:56:21.907527 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398"} err="failed to get container status \"d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398\": rpc error: code = NotFound desc = could not find container \"d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398\": container with ID starting with d664c28112b5cb1d8e53ad5f3d65d60c2384718a9ba732664fdc90a275b15398 not found: ID does not exist" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.190397 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.195309 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.217442 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:22 crc kubenswrapper[4727]: E1001 12:56:22.217810 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="428cf4e3-e21c-44ab-a562-4aee1e36956c" containerName="nova-api-api" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.217825 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="428cf4e3-e21c-44ab-a562-4aee1e36956c" containerName="nova-api-api" Oct 01 12:56:22 crc kubenswrapper[4727]: E1001 12:56:22.217851 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="428cf4e3-e21c-44ab-a562-4aee1e36956c" containerName="nova-api-log" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.217859 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="428cf4e3-e21c-44ab-a562-4aee1e36956c" containerName="nova-api-log" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.218049 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="428cf4e3-e21c-44ab-a562-4aee1e36956c" containerName="nova-api-api" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.218075 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="428cf4e3-e21c-44ab-a562-4aee1e36956c" containerName="nova-api-log" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.223886 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.227832 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.228143 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.228317 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.237972 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.308991 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.309385 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-config-data\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.309498 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-public-tls-certs\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.309557 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q92sj\" (UniqueName: \"kubernetes.io/projected/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-kube-api-access-q92sj\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.309706 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-logs\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.309754 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.382097 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="428cf4e3-e21c-44ab-a562-4aee1e36956c" path="/var/lib/kubelet/pods/428cf4e3-e21c-44ab-a562-4aee1e36956c/volumes" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.411469 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-public-tls-certs\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc 
kubenswrapper[4727]: I1001 12:56:22.411525 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q92sj\" (UniqueName: \"kubernetes.io/projected/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-kube-api-access-q92sj\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.411565 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-logs\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.411594 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.412098 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-logs\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.412099 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.412187 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-config-data\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.416700 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.416921 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-config-data\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.417633 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-public-tls-certs\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.418831 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.432034 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q92sj\" (UniqueName: \"kubernetes.io/projected/d1cdebdb-ee16-4183-a5ca-c80527ec9d5e-kube-api-access-q92sj\") pod \"nova-api-0\" (UID: \"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e\") " pod="openstack/nova-api-0" Oct 01 12:56:22 crc kubenswrapper[4727]: I1001 12:56:22.572892 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 12:56:23 crc kubenswrapper[4727]: I1001 12:56:23.039456 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 12:56:23 crc kubenswrapper[4727]: W1001 12:56:23.043450 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1cdebdb_ee16_4183_a5ca_c80527ec9d5e.slice/crio-2fe96a243badda5cf3fb5cc4b9f2ec580f3c6e7856793908d0cb076d21b16389 WatchSource:0}: Error finding container 2fe96a243badda5cf3fb5cc4b9f2ec580f3c6e7856793908d0cb076d21b16389: Status 404 returned error can't find the container with id 2fe96a243badda5cf3fb5cc4b9f2ec580f3c6e7856793908d0cb076d21b16389 Oct 01 12:56:23 crc kubenswrapper[4727]: I1001 12:56:23.873903 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e","Type":"ContainerStarted","Data":"75b1c6965e99e2ef7efb4f1808fbbcc267eefd12a2c01522d80c5645fb678b75"} Oct 01 12:56:23 crc kubenswrapper[4727]: I1001 12:56:23.874306 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e","Type":"ContainerStarted","Data":"1b058b40b3488fc91661a05cf08fa4407d54b4960e28a4eef87398e35c09d74d"} Oct 01 12:56:23 crc kubenswrapper[4727]: I1001 12:56:23.874322 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d1cdebdb-ee16-4183-a5ca-c80527ec9d5e","Type":"ContainerStarted","Data":"2fe96a243badda5cf3fb5cc4b9f2ec580f3c6e7856793908d0cb076d21b16389"} Oct 01 12:56:23 crc kubenswrapper[4727]: I1001 12:56:23.894159 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.8941369940000001 podStartE2EDuration="1.894136994s" podCreationTimestamp="2025-10-01 12:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:56:23.889995064 +0000 UTC m=+1162.211349901" watchObservedRunningTime="2025-10-01 12:56:23.894136994 +0000 UTC m=+1162.215491851" Oct 01 12:56:24 crc kubenswrapper[4727]: I1001 12:56:24.527280 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 12:56:24 crc kubenswrapper[4727]: I1001 12:56:24.527541 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 12:56:25 crc kubenswrapper[4727]: I1001 12:56:25.582487 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 12:56:29 crc kubenswrapper[4727]: I1001 12:56:29.524430 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 12:56:29 crc kubenswrapper[4727]: I1001 12:56:29.525061 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 12:56:30 crc kubenswrapper[4727]: I1001 12:56:30.536207 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 12:56:30 crc kubenswrapper[4727]: I1001 12:56:30.536255 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 12:56:30 crc kubenswrapper[4727]: I1001 12:56:30.581671 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 12:56:30 crc kubenswrapper[4727]: I1001 12:56:30.608514 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 12:56:30 crc kubenswrapper[4727]: I1001 12:56:30.981735 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 12:56:31 crc kubenswrapper[4727]: E1001 12:56:31.347895 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b755da_d064_4481_b856_4b51bb15cecb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b755da_d064_4481_b856_4b51bb15cecb.slice/crio-ab26382c2d1a3ca68c36b50aeb8656829916d467a5c3db235041143e82fda2c6\": RecentStats: unable to find data in memory cache]" Oct 01 12:56:32 crc kubenswrapper[4727]: I1001 12:56:32.573366 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:56:32 crc kubenswrapper[4727]: I1001 12:56:32.573791 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 12:56:32 crc kubenswrapper[4727]: I1001 12:56:32.940691 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 12:56:33 crc kubenswrapper[4727]: I1001 12:56:33.291975 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:56:33 crc kubenswrapper[4727]: I1001 12:56:33.292157 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:56:33 crc kubenswrapper[4727]: I1001 12:56:33.292314 4727 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:56:33 crc kubenswrapper[4727]: I1001 12:56:33.293146 4727 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d15726f80d85ac871118ff8508f8fbb90331c1d082df7e96a9adc970ffc70f86"} pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" containerMessage="Container machine-config-daemon failed liveness probe, 
will be restarted" Oct 01 12:56:33 crc kubenswrapper[4727]: I1001 12:56:33.293217 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" containerID="cri-o://d15726f80d85ac871118ff8508f8fbb90331c1d082df7e96a9adc970ffc70f86" gracePeriod=600 Oct 01 12:56:33 crc kubenswrapper[4727]: I1001 12:56:33.592378 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d1cdebdb-ee16-4183-a5ca-c80527ec9d5e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 12:56:33 crc kubenswrapper[4727]: I1001 12:56:33.592440 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d1cdebdb-ee16-4183-a5ca-c80527ec9d5e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 12:56:33 crc kubenswrapper[4727]: I1001 12:56:33.977589 4727 generic.go:334] "Generic (PLEG): container finished" podID="d18290ae-64a5-44a5-a704-90977d85852b" containerID="d15726f80d85ac871118ff8508f8fbb90331c1d082df7e96a9adc970ffc70f86" exitCode=0 Oct 01 12:56:33 crc kubenswrapper[4727]: I1001 12:56:33.978077 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerDied","Data":"d15726f80d85ac871118ff8508f8fbb90331c1d082df7e96a9adc970ffc70f86"} Oct 01 12:56:33 crc kubenswrapper[4727]: I1001 12:56:33.978127 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"cacd2a9209dd857fc1890a57e560a24e6efca70576638e54f6197ee82d5463f5"} Oct 01 12:56:33 crc kubenswrapper[4727]: I1001 12:56:33.978159 4727 scope.go:117] "RemoveContainer" containerID="8ee5ee2e5696638af5bc213bd13dc53b7b85703a971ba03bb8cf933270c1945e" Oct 01 12:56:36 crc kubenswrapper[4727]: I1001 12:56:36.625749 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:56:36 crc kubenswrapper[4727]: I1001 12:56:36.626450 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4226303d-42f1-4267-ac66-5db1def22a4b" containerName="kube-state-metrics" containerID="cri-o://f1fb8e78533bbf52baf5a0c050a858cd1f1c1685dfb78834d97e9dca7b4d4504" gracePeriod=30 Oct 01 12:56:37 crc kubenswrapper[4727]: I1001 12:56:37.012134 4727 generic.go:334] "Generic (PLEG): container finished" podID="4226303d-42f1-4267-ac66-5db1def22a4b" containerID="f1fb8e78533bbf52baf5a0c050a858cd1f1c1685dfb78834d97e9dca7b4d4504" exitCode=2 Oct 01 12:56:37 crc kubenswrapper[4727]: I1001 12:56:37.012362 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4226303d-42f1-4267-ac66-5db1def22a4b","Type":"ContainerDied","Data":"f1fb8e78533bbf52baf5a0c050a858cd1f1c1685dfb78834d97e9dca7b4d4504"} Oct 01 12:56:37 crc kubenswrapper[4727]: I1001 12:56:37.012579 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"4226303d-42f1-4267-ac66-5db1def22a4b","Type":"ContainerDied","Data":"a130e9643820d5742c5b4ae56bfff4d563dbe0fcec28cbead0e40bdb41cfedf7"} Oct 01 12:56:37 crc kubenswrapper[4727]: I1001 12:56:37.012605 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a130e9643820d5742c5b4ae56bfff4d563dbe0fcec28cbead0e40bdb41cfedf7" Oct 01 12:56:37 crc kubenswrapper[4727]: I1001 12:56:37.109270 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 12:56:37 crc kubenswrapper[4727]: I1001 12:56:37.118137 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvnns\" (UniqueName: \"kubernetes.io/projected/4226303d-42f1-4267-ac66-5db1def22a4b-kube-api-access-rvnns\") pod \"4226303d-42f1-4267-ac66-5db1def22a4b\" (UID: \"4226303d-42f1-4267-ac66-5db1def22a4b\") " Oct 01 12:56:37 crc kubenswrapper[4727]: I1001 12:56:37.126958 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4226303d-42f1-4267-ac66-5db1def22a4b-kube-api-access-rvnns" (OuterVolumeSpecName: "kube-api-access-rvnns") pod "4226303d-42f1-4267-ac66-5db1def22a4b" (UID: "4226303d-42f1-4267-ac66-5db1def22a4b"). InnerVolumeSpecName "kube-api-access-rvnns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:37 crc kubenswrapper[4727]: I1001 12:56:37.220419 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvnns\" (UniqueName: \"kubernetes.io/projected/4226303d-42f1-4267-ac66-5db1def22a4b-kube-api-access-rvnns\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.021060 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.061279 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.069169 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.082690 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:56:38 crc kubenswrapper[4727]: E1001 12:56:38.084930 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4226303d-42f1-4267-ac66-5db1def22a4b" containerName="kube-state-metrics" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.084966 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="4226303d-42f1-4267-ac66-5db1def22a4b" containerName="kube-state-metrics" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.085220 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="4226303d-42f1-4267-ac66-5db1def22a4b" containerName="kube-state-metrics" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.086814 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.088949 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.090406 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.095559 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.236079 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.236230 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.236276 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frcwj\" (UniqueName: \"kubernetes.io/projected/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-api-access-frcwj\") pod \"kube-state-metrics-0\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.236417 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.282477 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.282767 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="ceilometer-central-agent" containerID="cri-o://8a069a2b225cadd85feb1b4208eb9d85eb44cab68c20fedd3f32a9efc2e1577d" gracePeriod=30 Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.282844 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="proxy-httpd" containerID="cri-o://6c48651d2332668b6e6ba84bdbb4eb8fc2a2a34743da42e5f5649d1c59591b28" gracePeriod=30 Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.282895 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="sg-core" containerID="cri-o://f3539bfedbe166f40542484e56d0b55c37e0660807fd3c31668356168a31d3c7" gracePeriod=30 Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.282910 4727 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="ceilometer-notification-agent" containerID="cri-o://ff70d54505331bbe6815204a48ca6d5350a7249e39753e5c312aae6dbfd6fc70" gracePeriod=30 Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.337839 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.337887 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frcwj\" (UniqueName: \"kubernetes.io/projected/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-api-access-frcwj\") pod \"kube-state-metrics-0\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.337927 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.338029 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.343680 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.343913 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.345528 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.357738 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frcwj\" (UniqueName: \"kubernetes.io/projected/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-api-access-frcwj\") pod \"kube-state-metrics-0\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " pod="openstack/kube-state-metrics-0" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.384681 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4226303d-42f1-4267-ac66-5db1def22a4b" 
path="/var/lib/kubelet/pods/4226303d-42f1-4267-ac66-5db1def22a4b/volumes" Oct 01 12:56:38 crc kubenswrapper[4727]: I1001 12:56:38.407471 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 12:56:39 crc kubenswrapper[4727]: I1001 12:56:38.891499 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 12:56:39 crc kubenswrapper[4727]: W1001 12:56:38.896653 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6ef1c58_1426_4b49_90ff_9b5ee9cb6890.slice/crio-878bb83299b7f8353c2bb08c6fe70b100f33939e0812e46f4e57d4ca5501a5a9 WatchSource:0}: Error finding container 878bb83299b7f8353c2bb08c6fe70b100f33939e0812e46f4e57d4ca5501a5a9: Status 404 returned error can't find the container with id 878bb83299b7f8353c2bb08c6fe70b100f33939e0812e46f4e57d4ca5501a5a9 Oct 01 12:56:39 crc kubenswrapper[4727]: I1001 12:56:39.040411 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890","Type":"ContainerStarted","Data":"878bb83299b7f8353c2bb08c6fe70b100f33939e0812e46f4e57d4ca5501a5a9"} Oct 01 12:56:39 crc kubenswrapper[4727]: I1001 12:56:39.043505 4727 generic.go:334] "Generic (PLEG): container finished" podID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerID="6c48651d2332668b6e6ba84bdbb4eb8fc2a2a34743da42e5f5649d1c59591b28" exitCode=0 Oct 01 12:56:39 crc kubenswrapper[4727]: I1001 12:56:39.043553 4727 generic.go:334] "Generic (PLEG): container finished" podID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerID="f3539bfedbe166f40542484e56d0b55c37e0660807fd3c31668356168a31d3c7" exitCode=2 Oct 01 12:56:39 crc kubenswrapper[4727]: I1001 12:56:39.043564 4727 generic.go:334] "Generic (PLEG): container finished" podID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerID="8a069a2b225cadd85feb1b4208eb9d85eb44cab68c20fedd3f32a9efc2e1577d" exitCode=0 Oct 01 12:56:39 crc kubenswrapper[4727]: I1001 12:56:39.043604 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a96edf-502b-4528-a73b-b7fb945d3d80","Type":"ContainerDied","Data":"6c48651d2332668b6e6ba84bdbb4eb8fc2a2a34743da42e5f5649d1c59591b28"} Oct 01 12:56:39 crc kubenswrapper[4727]: I1001 12:56:39.043634 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a96edf-502b-4528-a73b-b7fb945d3d80","Type":"ContainerDied","Data":"f3539bfedbe166f40542484e56d0b55c37e0660807fd3c31668356168a31d3c7"} Oct 01 12:56:39 crc kubenswrapper[4727]: I1001 12:56:39.043648 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a96edf-502b-4528-a73b-b7fb945d3d80","Type":"ContainerDied","Data":"8a069a2b225cadd85feb1b4208eb9d85eb44cab68c20fedd3f32a9efc2e1577d"} Oct 01 12:56:39 crc kubenswrapper[4727]: I1001 12:56:39.530024 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 12:56:39 crc kubenswrapper[4727]: I1001 12:56:39.536024 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 12:56:39 crc kubenswrapper[4727]: I1001 12:56:39.540981 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 12:56:40 crc kubenswrapper[4727]: I1001 12:56:40.053637 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890","Type":"ContainerStarted","Data":"bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f"} Oct 01 12:56:40 crc kubenswrapper[4727]: I1001 12:56:40.055121 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 12:56:40 crc kubenswrapper[4727]: I1001 12:56:40.059013 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 12:56:40 crc kubenswrapper[4727]: I1001 12:56:40.071978 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.69612974 podStartE2EDuration="2.071962691s" podCreationTimestamp="2025-10-01 12:56:38 +0000 UTC" firstStartedPulling="2025-10-01 12:56:38.899330958 +0000 UTC m=+1177.220685795" lastFinishedPulling="2025-10-01 12:56:39.275163909 +0000 UTC m=+1177.596518746" observedRunningTime="2025-10-01 12:56:40.070707462 +0000 UTC m=+1178.392062319" watchObservedRunningTime="2025-10-01 12:56:40.071962691 +0000 UTC m=+1178.393317528" Oct 01 12:56:41 crc kubenswrapper[4727]: E1001 12:56:41.602055 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b755da_d064_4481_b856_4b51bb15cecb.slice/crio-ab26382c2d1a3ca68c36b50aeb8656829916d467a5c3db235041143e82fda2c6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b755da_d064_4481_b856_4b51bb15cecb.slice\": RecentStats: unable to find data in memory cache]" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.074580 4727 generic.go:334] "Generic (PLEG): container finished" podID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerID="ff70d54505331bbe6815204a48ca6d5350a7249e39753e5c312aae6dbfd6fc70" exitCode=0 Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.074678 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a96edf-502b-4528-a73b-b7fb945d3d80","Type":"ContainerDied","Data":"ff70d54505331bbe6815204a48ca6d5350a7249e39753e5c312aae6dbfd6fc70"} Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.337912 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.517047 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-config-data\") pod \"92a96edf-502b-4528-a73b-b7fb945d3d80\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.517127 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a96edf-502b-4528-a73b-b7fb945d3d80-log-httpd\") pod \"92a96edf-502b-4528-a73b-b7fb945d3d80\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.517156 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njr8m\" (UniqueName: \"kubernetes.io/projected/92a96edf-502b-4528-a73b-b7fb945d3d80-kube-api-access-njr8m\") pod \"92a96edf-502b-4528-a73b-b7fb945d3d80\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.517174 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-scripts\") pod \"92a96edf-502b-4528-a73b-b7fb945d3d80\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.517213 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-sg-core-conf-yaml\") pod \"92a96edf-502b-4528-a73b-b7fb945d3d80\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.517246 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-combined-ca-bundle\") pod \"92a96edf-502b-4528-a73b-b7fb945d3d80\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.517296 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a96edf-502b-4528-a73b-b7fb945d3d80-run-httpd\") pod \"92a96edf-502b-4528-a73b-b7fb945d3d80\" (UID: \"92a96edf-502b-4528-a73b-b7fb945d3d80\") " Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.518452 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a96edf-502b-4528-a73b-b7fb945d3d80-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "92a96edf-502b-4528-a73b-b7fb945d3d80" (UID: "92a96edf-502b-4528-a73b-b7fb945d3d80"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.518738 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a96edf-502b-4528-a73b-b7fb945d3d80-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "92a96edf-502b-4528-a73b-b7fb945d3d80" (UID: "92a96edf-502b-4528-a73b-b7fb945d3d80"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.523440 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-scripts" (OuterVolumeSpecName: "scripts") pod "92a96edf-502b-4528-a73b-b7fb945d3d80" (UID: "92a96edf-502b-4528-a73b-b7fb945d3d80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.537388 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a96edf-502b-4528-a73b-b7fb945d3d80-kube-api-access-njr8m" (OuterVolumeSpecName: "kube-api-access-njr8m") pod "92a96edf-502b-4528-a73b-b7fb945d3d80" (UID: "92a96edf-502b-4528-a73b-b7fb945d3d80"). InnerVolumeSpecName "kube-api-access-njr8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.550621 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "92a96edf-502b-4528-a73b-b7fb945d3d80" (UID: "92a96edf-502b-4528-a73b-b7fb945d3d80"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.582413 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.583060 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.583767 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.590288 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.597225 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92a96edf-502b-4528-a73b-b7fb945d3d80" (UID: "92a96edf-502b-4528-a73b-b7fb945d3d80"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.624492 4727 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a96edf-502b-4528-a73b-b7fb945d3d80-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.624535 4727 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/92a96edf-502b-4528-a73b-b7fb945d3d80-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.624545 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njr8m\" (UniqueName: \"kubernetes.io/projected/92a96edf-502b-4528-a73b-b7fb945d3d80-kube-api-access-njr8m\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.624555 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.624564 4727 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.624573 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.628996 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-config-data" (OuterVolumeSpecName: "config-data") pod "92a96edf-502b-4528-a73b-b7fb945d3d80" (UID: "92a96edf-502b-4528-a73b-b7fb945d3d80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:56:42 crc kubenswrapper[4727]: I1001 12:56:42.726785 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a96edf-502b-4528-a73b-b7fb945d3d80-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.086452 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"92a96edf-502b-4528-a73b-b7fb945d3d80","Type":"ContainerDied","Data":"ec47f7cff295dc14ecbfdb166ee4877c535bb3e1587f840723f3dbe7f8362b73"} Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.086532 4727 scope.go:117] "RemoveContainer" containerID="6c48651d2332668b6e6ba84bdbb4eb8fc2a2a34743da42e5f5649d1c59591b28" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.086491 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.086879 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.097977 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.112726 4727 scope.go:117] "RemoveContainer" containerID="f3539bfedbe166f40542484e56d0b55c37e0660807fd3c31668356168a31d3c7" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.158199 4727 scope.go:117] "RemoveContainer" containerID="ff70d54505331bbe6815204a48ca6d5350a7249e39753e5c312aae6dbfd6fc70" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.159267 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.177480 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.194109 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:43 crc kubenswrapper[4727]: E1001 12:56:43.194580 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="ceilometer-central-agent" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.194592 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="ceilometer-central-agent" Oct 01 12:56:43 crc kubenswrapper[4727]: E1001 12:56:43.194600 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="sg-core" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.194607 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="sg-core" Oct 01 12:56:43 crc kubenswrapper[4727]: E1001 12:56:43.194646 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="proxy-httpd" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.194652 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="proxy-httpd" Oct 01 12:56:43 crc kubenswrapper[4727]: E1001 12:56:43.194664 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="ceilometer-notification-agent" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.194670 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="ceilometer-notification-agent" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.194834 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="sg-core" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.194851 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="ceilometer-notification-agent" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.194871 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="proxy-httpd" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.194887 4727 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" containerName="ceilometer-central-agent" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.196231 4727 scope.go:117] "RemoveContainer" containerID="8a069a2b225cadd85feb1b4208eb9d85eb44cab68c20fedd3f32a9efc2e1577d" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.209856 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.212840 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.213120 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.213397 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.215525 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.242553 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fd48c01-403c-4528-86f1-96de985389e4-log-httpd\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.242628 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.242683 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.242751 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-config-data\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.242786 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fd48c01-403c-4528-86f1-96de985389e4-run-httpd\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.242824 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-scripts\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.242884 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x4ct\" (UniqueName: 
\"kubernetes.io/projected/5fd48c01-403c-4528-86f1-96de985389e4-kube-api-access-7x4ct\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.242958 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.344248 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-config-data\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.344303 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fd48c01-403c-4528-86f1-96de985389e4-run-httpd\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.344325 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-scripts\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.344368 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x4ct\" (UniqueName: \"kubernetes.io/projected/5fd48c01-403c-4528-86f1-96de985389e4-kube-api-access-7x4ct\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.344413 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.344439 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fd48c01-403c-4528-86f1-96de985389e4-log-httpd\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.344476 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.344514 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.345015 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fd48c01-403c-4528-86f1-96de985389e4-run-httpd\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.345256 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fd48c01-403c-4528-86f1-96de985389e4-log-httpd\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.351181 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-config-data\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.351777 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.351820 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.352528 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-scripts\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.358376 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.361382 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x4ct\" (UniqueName: \"kubernetes.io/projected/5fd48c01-403c-4528-86f1-96de985389e4-kube-api-access-7x4ct\") pod \"ceilometer-0\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " pod="openstack/ceilometer-0" Oct 01 12:56:43 crc kubenswrapper[4727]: I1001 12:56:43.547490 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:56:44 crc kubenswrapper[4727]: I1001 12:56:44.016214 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:56:44 crc kubenswrapper[4727]: I1001 12:56:44.097308 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fd48c01-403c-4528-86f1-96de985389e4","Type":"ContainerStarted","Data":"78f60ae58f56a2feeae68700012afb2756ba3cb63986f39e1827e3c7a61cad09"} Oct 01 12:56:44 crc kubenswrapper[4727]: I1001 12:56:44.387309 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a96edf-502b-4528-a73b-b7fb945d3d80" path="/var/lib/kubelet/pods/92a96edf-502b-4528-a73b-b7fb945d3d80/volumes" Oct 01 12:56:45 crc kubenswrapper[4727]: I1001 12:56:45.108627 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fd48c01-403c-4528-86f1-96de985389e4","Type":"ContainerStarted","Data":"f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820"} Oct 01 12:56:47 crc kubenswrapper[4727]: I1001 12:56:47.150919 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fd48c01-403c-4528-86f1-96de985389e4","Type":"ContainerStarted","Data":"2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561"} Oct 01 12:56:47 crc kubenswrapper[4727]: I1001 12:56:47.486399 4727 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 12:56:48 crc kubenswrapper[4727]: I1001 12:56:48.159916 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fd48c01-403c-4528-86f1-96de985389e4","Type":"ContainerStarted","Data":"e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0"} Oct 01 12:56:48 crc kubenswrapper[4727]: I1001 12:56:48.418736 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 12:56:49 crc kubenswrapper[4727]: I1001 12:56:49.172284 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fd48c01-403c-4528-86f1-96de985389e4","Type":"ContainerStarted","Data":"44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9"} Oct 01 12:56:49 crc kubenswrapper[4727]: I1001 12:56:49.172783 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:56:49 crc kubenswrapper[4727]: I1001 12:56:49.210823 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.726290804 podStartE2EDuration="6.210802801s" podCreationTimestamp="2025-10-01 12:56:43 +0000 UTC" firstStartedPulling="2025-10-01 12:56:44.017788501 +0000 UTC m=+1182.339143338" lastFinishedPulling="2025-10-01 12:56:48.502300498 +0000 UTC m=+1186.823655335" observedRunningTime="2025-10-01 12:56:49.201189608 +0000 UTC m=+1187.522544465" watchObservedRunningTime="2025-10-01 12:56:49.210802801 +0000 UTC m=+1187.532157648" Oct 01 12:56:51 crc kubenswrapper[4727]: E1001 12:56:51.844851 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b755da_d064_4481_b856_4b51bb15cecb.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b755da_d064_4481_b856_4b51bb15cecb.slice/crio-ab26382c2d1a3ca68c36b50aeb8656829916d467a5c3db235041143e82fda2c6\": RecentStats: unable to find data in memory cache]" Oct 01 12:57:02 crc kubenswrapper[4727]: E1001 12:57:02.090881 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b755da_d064_4481_b856_4b51bb15cecb.slice/crio-ab26382c2d1a3ca68c36b50aeb8656829916d467a5c3db235041143e82fda2c6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27b755da_d064_4481_b856_4b51bb15cecb.slice\": RecentStats: unable to find data in memory cache]" Oct 01 12:57:08 crc kubenswrapper[4727]: I1001 12:57:08.647453 4727 scope.go:117] "RemoveContainer" containerID="2dc43babd2366e63d5fdb7774109b17aa5f48c7096f0eca313b6049c5df43b39" Oct 01 12:57:08 crc kubenswrapper[4727]: I1001 12:57:08.672874 4727 scope.go:117] "RemoveContainer" containerID="5ba009bf984cb867ab10b7b8db87fb9aeb13766a56b156fea719d1dd2c08fe87" Oct 01 12:57:08 crc kubenswrapper[4727]: I1001 12:57:08.712718 4727 scope.go:117] "RemoveContainer" containerID="bad00c9b86d92fefc924e90b2ad8dc3ae59d0925c8ab4613c1cfcd9b1d2ec2cd" Oct 01 12:57:13 crc kubenswrapper[4727]: I1001 12:57:13.554073 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 12:57:26 crc kubenswrapper[4727]: I1001 12:57:26.103508 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:57:26 crc kubenswrapper[4727]: I1001 12:57:26.104431 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="ceilometer-central-agent" containerID="cri-o://f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820" gracePeriod=30 Oct 01 12:57:26 crc kubenswrapper[4727]: I1001 12:57:26.104510 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="sg-core" containerID="cri-o://e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0" gracePeriod=30 Oct 01 12:57:26 crc kubenswrapper[4727]: I1001 12:57:26.104561 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="ceilometer-notification-agent" containerID="cri-o://2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561" gracePeriod=30 Oct 01 12:57:26 crc kubenswrapper[4727]: I1001 12:57:26.104693 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="proxy-httpd" containerID="cri-o://44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9" gracePeriod=30 Oct 01 12:57:26 crc kubenswrapper[4727]: I1001 12:57:26.545308 4727 generic.go:334] "Generic (PLEG): container finished" podID="5fd48c01-403c-4528-86f1-96de985389e4" containerID="44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9" exitCode=0 Oct 01 12:57:26 crc kubenswrapper[4727]: I1001 12:57:26.545605 4727 generic.go:334] "Generic (PLEG): container finished" podID="5fd48c01-403c-4528-86f1-96de985389e4" 
containerID="e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0" exitCode=2 Oct 01 12:57:26 crc kubenswrapper[4727]: I1001 12:57:26.545615 4727 generic.go:334] "Generic (PLEG): container finished" podID="5fd48c01-403c-4528-86f1-96de985389e4" containerID="f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820" exitCode=0 Oct 01 12:57:26 crc kubenswrapper[4727]: I1001 12:57:26.545381 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fd48c01-403c-4528-86f1-96de985389e4","Type":"ContainerDied","Data":"44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9"} Oct 01 12:57:26 crc kubenswrapper[4727]: I1001 12:57:26.545648 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fd48c01-403c-4528-86f1-96de985389e4","Type":"ContainerDied","Data":"e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0"} Oct 01 12:57:26 crc kubenswrapper[4727]: I1001 12:57:26.545662 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fd48c01-403c-4528-86f1-96de985389e4","Type":"ContainerDied","Data":"f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820"} Oct 01 12:57:26 crc kubenswrapper[4727]: I1001 12:57:26.917049 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:57:27 crc kubenswrapper[4727]: I1001 12:57:27.702905 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.242153 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.262246 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-ceilometer-tls-certs\") pod \"5fd48c01-403c-4528-86f1-96de985389e4\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.262316 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-combined-ca-bundle\") pod \"5fd48c01-403c-4528-86f1-96de985389e4\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.262414 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-config-data\") pod \"5fd48c01-403c-4528-86f1-96de985389e4\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.262495 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-sg-core-conf-yaml\") pod \"5fd48c01-403c-4528-86f1-96de985389e4\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.262587 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fd48c01-403c-4528-86f1-96de985389e4-run-httpd\") pod \"5fd48c01-403c-4528-86f1-96de985389e4\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.262639 
4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fd48c01-403c-4528-86f1-96de985389e4-log-httpd\") pod \"5fd48c01-403c-4528-86f1-96de985389e4\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.262660 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-scripts\") pod \"5fd48c01-403c-4528-86f1-96de985389e4\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.262707 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x4ct\" (UniqueName: \"kubernetes.io/projected/5fd48c01-403c-4528-86f1-96de985389e4-kube-api-access-7x4ct\") pod \"5fd48c01-403c-4528-86f1-96de985389e4\" (UID: \"5fd48c01-403c-4528-86f1-96de985389e4\") " Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.263220 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fd48c01-403c-4528-86f1-96de985389e4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5fd48c01-403c-4528-86f1-96de985389e4" (UID: "5fd48c01-403c-4528-86f1-96de985389e4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.263309 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fd48c01-403c-4528-86f1-96de985389e4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5fd48c01-403c-4528-86f1-96de985389e4" (UID: "5fd48c01-403c-4528-86f1-96de985389e4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.274428 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-scripts" (OuterVolumeSpecName: "scripts") pod "5fd48c01-403c-4528-86f1-96de985389e4" (UID: "5fd48c01-403c-4528-86f1-96de985389e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.303581 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fd48c01-403c-4528-86f1-96de985389e4-kube-api-access-7x4ct" (OuterVolumeSpecName: "kube-api-access-7x4ct") pod "5fd48c01-403c-4528-86f1-96de985389e4" (UID: "5fd48c01-403c-4528-86f1-96de985389e4"). InnerVolumeSpecName "kube-api-access-7x4ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.328082 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5fd48c01-403c-4528-86f1-96de985389e4" (UID: "5fd48c01-403c-4528-86f1-96de985389e4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.364021 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x4ct\" (UniqueName: \"kubernetes.io/projected/5fd48c01-403c-4528-86f1-96de985389e4-kube-api-access-7x4ct\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.364047 4727 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.364058 4727 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fd48c01-403c-4528-86f1-96de985389e4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.364066 4727 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fd48c01-403c-4528-86f1-96de985389e4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.364074 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.389571 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5fd48c01-403c-4528-86f1-96de985389e4" (UID: "5fd48c01-403c-4528-86f1-96de985389e4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.399613 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fd48c01-403c-4528-86f1-96de985389e4" (UID: "5fd48c01-403c-4528-86f1-96de985389e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.412675 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-config-data" (OuterVolumeSpecName: "config-data") pod "5fd48c01-403c-4528-86f1-96de985389e4" (UID: "5fd48c01-403c-4528-86f1-96de985389e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.465937 4727 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.465965 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.465974 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fd48c01-403c-4528-86f1-96de985389e4-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.572166 4727 generic.go:334] "Generic (PLEG): container finished" podID="5fd48c01-403c-4528-86f1-96de985389e4" containerID="2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561" exitCode=0 Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.572223 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.572215 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fd48c01-403c-4528-86f1-96de985389e4","Type":"ContainerDied","Data":"2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561"} Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.572701 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fd48c01-403c-4528-86f1-96de985389e4","Type":"ContainerDied","Data":"78f60ae58f56a2feeae68700012afb2756ba3cb63986f39e1827e3c7a61cad09"} Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.572722 4727 scope.go:117] "RemoveContainer" containerID="44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.596012 4727 scope.go:117] "RemoveContainer" containerID="e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.609794 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.621809 4727 scope.go:117] "RemoveContainer" containerID="2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.623872 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.652874 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:57:29 crc kubenswrapper[4727]: E1001 12:57:29.653355 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="ceilometer-notification-agent" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.653381 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="ceilometer-notification-agent" Oct 01 12:57:29 crc kubenswrapper[4727]: E1001 12:57:29.653402 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="proxy-httpd" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.653411 
4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="proxy-httpd" Oct 01 12:57:29 crc kubenswrapper[4727]: E1001 12:57:29.653435 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="ceilometer-central-agent" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.653442 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="ceilometer-central-agent" Oct 01 12:57:29 crc kubenswrapper[4727]: E1001 12:57:29.653464 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="sg-core" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.653471 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="sg-core" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.653678 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="ceilometer-notification-agent" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.653708 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="ceilometer-central-agent" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.653723 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="proxy-httpd" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.653739 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fd48c01-403c-4528-86f1-96de985389e4" containerName="sg-core" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.653777 4727 scope.go:117] "RemoveContainer" containerID="f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.656053 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.660439 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.660625 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.660970 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.672123 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.674791 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-config-data\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.674836 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-scripts\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.674887 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f4fac25-c782-4f4c-ab50-62969ea1f369-log-httpd\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.674930 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.674991 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f4fac25-c782-4f4c-ab50-62969ea1f369-run-httpd\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.675040 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.675069 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.675146 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdpt6\" (UniqueName: 
\"kubernetes.io/projected/7f4fac25-c782-4f4c-ab50-62969ea1f369-kube-api-access-sdpt6\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.704033 4727 scope.go:117] "RemoveContainer" containerID="44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9" Oct 01 12:57:29 crc kubenswrapper[4727]: E1001 12:57:29.708611 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9\": container with ID starting with 44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9 not found: ID does not exist" containerID="44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.708646 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9"} err="failed to get container status \"44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9\": rpc error: code = NotFound desc = could not find container \"44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9\": container with ID starting with 44da6473d02ea04ea3a94a4457f7678db3d3de68799818aef51ab5b222edf8e9 not found: ID does not exist" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.708670 4727 scope.go:117] "RemoveContainer" containerID="e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0" Oct 01 12:57:29 crc kubenswrapper[4727]: E1001 12:57:29.709147 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0\": container with ID starting with e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0 not found: ID does not exist" containerID="e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.709164 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0"} err="failed to get container status \"e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0\": rpc error: code = NotFound desc = could not find container \"e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0\": container with ID starting with e42daa1d18b8aafed66c8d1a6b8f5c79eeb31d1734df6ee4f79060b16fe687a0 not found: ID does not exist" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.709176 4727 scope.go:117] "RemoveContainer" containerID="2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561" Oct 01 12:57:29 crc kubenswrapper[4727]: E1001 12:57:29.709512 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561\": container with ID starting with 2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561 not found: ID does not exist" containerID="2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.709531 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561"} err="failed to 
get container status \"2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561\": rpc error: code = NotFound desc = could not find container \"2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561\": container with ID starting with 2771d4d6bc317242a2e3c83c1d51b13124d8c1d294b775da7354ac6f8ef13561 not found: ID does not exist" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.709544 4727 scope.go:117] "RemoveContainer" containerID="f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820" Oct 01 12:57:29 crc kubenswrapper[4727]: E1001 12:57:29.709734 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820\": container with ID starting with f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820 not found: ID does not exist" containerID="f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.709753 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820"} err="failed to get container status \"f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820\": rpc error: code = NotFound desc = could not find container \"f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820\": container with ID starting with f0594a1720f5b9103eb8f9404af6d25e540a192c5509edd4620a71dbc4020820 not found: ID does not exist" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.776740 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f4fac25-c782-4f4c-ab50-62969ea1f369-run-httpd\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.776810 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.776842 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.776916 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdpt6\" (UniqueName: \"kubernetes.io/projected/7f4fac25-c782-4f4c-ab50-62969ea1f369-kube-api-access-sdpt6\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.776974 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-config-data\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.777023 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-scripts\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.777069 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f4fac25-c782-4f4c-ab50-62969ea1f369-log-httpd\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.777109 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.778397 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f4fac25-c782-4f4c-ab50-62969ea1f369-log-httpd\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.778220 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f4fac25-c782-4f4c-ab50-62969ea1f369-run-httpd\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.782357 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.782365 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-scripts\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.782805 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.782941 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-config-data\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.783244 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.795743 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdpt6\" (UniqueName: \"kubernetes.io/projected/7f4fac25-c782-4f4c-ab50-62969ea1f369-kube-api-access-sdpt6\") 
pod \"ceilometer-0\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " pod="openstack/ceilometer-0" Oct 01 12:57:29 crc kubenswrapper[4727]: I1001 12:57:29.984160 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 12:57:30 crc kubenswrapper[4727]: I1001 12:57:30.384789 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fd48c01-403c-4528-86f1-96de985389e4" path="/var/lib/kubelet/pods/5fd48c01-403c-4528-86f1-96de985389e4/volumes" Oct 01 12:57:30 crc kubenswrapper[4727]: I1001 12:57:30.449416 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 12:57:30 crc kubenswrapper[4727]: I1001 12:57:30.587211 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f4fac25-c782-4f4c-ab50-62969ea1f369","Type":"ContainerStarted","Data":"873bb98802845f78b22d9e2886a79aec9ddd7a17f018e3f2ad02a9a67c6c1414"} Oct 01 12:57:31 crc kubenswrapper[4727]: I1001 12:57:31.292598 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="74ad068e-3c83-4fd2-af0a-7e45cd945411" containerName="rabbitmq" containerID="cri-o://8a81300adaa1a79c48be60f260d6b2c72adcabfd8abff87258b84d4734083da2" gracePeriod=604796 Oct 01 12:57:32 crc kubenswrapper[4727]: I1001 12:57:32.162659 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" containerName="rabbitmq" containerID="cri-o://d4be881e32be1bbc9d14d2cf1ac2c6bd783bdb664cb9a3a02a4b98b201bc72ec" gracePeriod=604796 Oct 01 12:57:37 crc kubenswrapper[4727]: I1001 12:57:37.614412 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="74ad068e-3c83-4fd2-af0a-7e45cd945411" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Oct 01 12:57:37 crc kubenswrapper[4727]: I1001 12:57:37.658924 4727 generic.go:334] "Generic (PLEG): container finished" podID="74ad068e-3c83-4fd2-af0a-7e45cd945411" containerID="8a81300adaa1a79c48be60f260d6b2c72adcabfd8abff87258b84d4734083da2" exitCode=0 Oct 01 12:57:37 crc kubenswrapper[4727]: I1001 12:57:37.658991 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74ad068e-3c83-4fd2-af0a-7e45cd945411","Type":"ContainerDied","Data":"8a81300adaa1a79c48be60f260d6b2c72adcabfd8abff87258b84d4734083da2"} Oct 01 12:57:38 crc kubenswrapper[4727]: I1001 12:57:38.081540 4727 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Oct 01 12:57:38 crc kubenswrapper[4727]: I1001 12:57:38.669877 4727 generic.go:334] "Generic (PLEG): container finished" podID="42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" containerID="d4be881e32be1bbc9d14d2cf1ac2c6bd783bdb664cb9a3a02a4b98b201bc72ec" exitCode=0 Oct 01 12:57:38 crc kubenswrapper[4727]: I1001 12:57:38.669959 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876","Type":"ContainerDied","Data":"d4be881e32be1bbc9d14d2cf1ac2c6bd783bdb664cb9a3a02a4b98b201bc72ec"} Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.491229 4727 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-67b789f86c-hcj6p"] Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.496114 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.501431 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.503592 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-hcj6p"] Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.582232 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.588348 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.631362 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.631447 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fjs9\" (UniqueName: \"kubernetes.io/projected/e19b79e9-2913-42e3-9257-b2750475ace3-kube-api-access-6fjs9\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.631476 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.631520 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.631540 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.631568 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-dns-svc\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.631681 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-config\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.697530 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"74ad068e-3c83-4fd2-af0a-7e45cd945411","Type":"ContainerDied","Data":"88e8b697b422e778535380bfe388b3ac97fcfdf6805ac9932a8be3297366f19b"} Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.697581 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.697590 4727 scope.go:117] "RemoveContainer" containerID="8a81300adaa1a79c48be60f260d6b2c72adcabfd8abff87258b84d4734083da2" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.709439 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876","Type":"ContainerDied","Data":"7507e5db8e1365746122c603727e5ff2587d837e674590d8264e66d621fb1866"} Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.709522 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.732830 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"74ad068e-3c83-4fd2-af0a-7e45cd945411\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.732939 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-plugins-conf\") pod \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.732991 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-plugins\") pod \"74ad068e-3c83-4fd2-af0a-7e45cd945411\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733096 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-plugins\") pod \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733148 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-tls\") pod \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733183 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-erlang-cookie\") pod \"74ad068e-3c83-4fd2-af0a-7e45cd945411\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 
12:57:40.733229 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxjwx\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-kube-api-access-jxjwx\") pod \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733252 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-erlang-cookie-secret\") pod \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733282 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74ad068e-3c83-4fd2-af0a-7e45cd945411-pod-info\") pod \"74ad068e-3c83-4fd2-af0a-7e45cd945411\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733318 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-config-data\") pod \"74ad068e-3c83-4fd2-af0a-7e45cd945411\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733373 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-erlang-cookie\") pod \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733399 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-server-conf\") pod \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733428 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-pod-info\") pod \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733471 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-confd\") pod \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733505 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74ad068e-3c83-4fd2-af0a-7e45cd945411-erlang-cookie-secret\") pod \"74ad068e-3c83-4fd2-af0a-7e45cd945411\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733525 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733579 4727 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-plugins-conf\") pod \"74ad068e-3c83-4fd2-af0a-7e45cd945411\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733602 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-confd\") pod \"74ad068e-3c83-4fd2-af0a-7e45cd945411\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733652 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phpnd\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-kube-api-access-phpnd\") pod \"74ad068e-3c83-4fd2-af0a-7e45cd945411\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733701 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-config-data\") pod \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\" (UID: \"42c8d9a9-fa0f-44c5-9ac1-2361f24c0876\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733733 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-tls\") pod \"74ad068e-3c83-4fd2-af0a-7e45cd945411\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.733776 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-server-conf\") pod \"74ad068e-3c83-4fd2-af0a-7e45cd945411\" (UID: \"74ad068e-3c83-4fd2-af0a-7e45cd945411\") " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.734093 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-config\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.734176 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" (UID: "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.734204 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.734379 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fjs9\" (UniqueName: \"kubernetes.io/projected/e19b79e9-2913-42e3-9257-b2750475ace3-kube-api-access-6fjs9\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.734405 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.734443 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.734460 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.734485 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-dns-svc\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.734608 4727 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.734631 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "74ad068e-3c83-4fd2-af0a-7e45cd945411" (UID: "74ad068e-3c83-4fd2-af0a-7e45cd945411"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.735474 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" (UID: "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.735977 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.736081 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-config\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.736561 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.737356 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "74ad068e-3c83-4fd2-af0a-7e45cd945411" (UID: "74ad068e-3c83-4fd2-af0a-7e45cd945411"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.737296 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.738096 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.742143 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-kube-api-access-phpnd" (OuterVolumeSpecName: "kube-api-access-phpnd") pod "74ad068e-3c83-4fd2-af0a-7e45cd945411" (UID: "74ad068e-3c83-4fd2-af0a-7e45cd945411"). InnerVolumeSpecName "kube-api-access-phpnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.742802 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-dns-svc\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.743453 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "74ad068e-3c83-4fd2-af0a-7e45cd945411" (UID: "74ad068e-3c83-4fd2-af0a-7e45cd945411"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.744470 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" (UID: "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.744558 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "74ad068e-3c83-4fd2-af0a-7e45cd945411" (UID: "74ad068e-3c83-4fd2-af0a-7e45cd945411"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.745930 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/74ad068e-3c83-4fd2-af0a-7e45cd945411-pod-info" (OuterVolumeSpecName: "pod-info") pod "74ad068e-3c83-4fd2-af0a-7e45cd945411" (UID: "74ad068e-3c83-4fd2-af0a-7e45cd945411"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.745965 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" (UID: "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.750448 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" (UID: "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.750670 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74ad068e-3c83-4fd2-af0a-7e45cd945411-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "74ad068e-3c83-4fd2-af0a-7e45cd945411" (UID: "74ad068e-3c83-4fd2-af0a-7e45cd945411"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.757958 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-pod-info" (OuterVolumeSpecName: "pod-info") pod "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" (UID: "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.760355 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" (UID: "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.761756 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "74ad068e-3c83-4fd2-af0a-7e45cd945411" (UID: "74ad068e-3c83-4fd2-af0a-7e45cd945411"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.782330 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-kube-api-access-jxjwx" (OuterVolumeSpecName: "kube-api-access-jxjwx") pod "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" (UID: "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876"). InnerVolumeSpecName "kube-api-access-jxjwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.790604 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fjs9\" (UniqueName: \"kubernetes.io/projected/e19b79e9-2913-42e3-9257-b2750475ace3-kube-api-access-6fjs9\") pod \"dnsmasq-dns-67b789f86c-hcj6p\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.803132 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-config-data" (OuterVolumeSpecName: "config-data") pod "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" (UID: "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842123 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phpnd\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-kube-api-access-phpnd\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842155 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842167 4727 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842193 4727 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842203 4727 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842211 4727 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842219 4727 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842227 4727 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842236 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxjwx\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-kube-api-access-jxjwx\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842244 4727 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842256 4727 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/74ad068e-3c83-4fd2-af0a-7e45cd945411-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842264 4727 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842272 4727 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-pod-info\") on node \"crc\" 
DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842285 4727 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842293 4727 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/74ad068e-3c83-4fd2-af0a-7e45cd945411-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.842302 4727 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.858319 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-server-conf" (OuterVolumeSpecName: "server-conf") pod "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" (UID: "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.860589 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-config-data" (OuterVolumeSpecName: "config-data") pod "74ad068e-3c83-4fd2-af0a-7e45cd945411" (UID: "74ad068e-3c83-4fd2-af0a-7e45cd945411"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.874771 4727 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.880356 4727 scope.go:117] "RemoveContainer" containerID="f86c783bf5387b32ebbd44f58fac153a5bb6b813b9d45eab24f2dd3b42107af2" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.881288 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-server-conf" (OuterVolumeSpecName: "server-conf") pod "74ad068e-3c83-4fd2-af0a-7e45cd945411" (UID: "74ad068e-3c83-4fd2-af0a-7e45cd945411"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.884492 4727 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.926535 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.943716 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.943910 4727 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.943982 4727 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.944078 4727 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/74ad068e-3c83-4fd2-af0a-7e45cd945411-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.944140 4727 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.947479 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "74ad068e-3c83-4fd2-af0a-7e45cd945411" (UID: "74ad068e-3c83-4fd2-af0a-7e45cd945411"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:40 crc kubenswrapper[4727]: I1001 12:57:40.973207 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" (UID: "42c8d9a9-fa0f-44c5-9ac1-2361f24c0876"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.048258 4727 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.048306 4727 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/74ad068e-3c83-4fd2-af0a-7e45cd945411-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.096430 4727 scope.go:117] "RemoveContainer" containerID="d4be881e32be1bbc9d14d2cf1ac2c6bd783bdb664cb9a3a02a4b98b201bc72ec" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.160056 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.179363 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.193121 4727 scope.go:117] "RemoveContainer" containerID="cf1055988e7ddf033939cb0af9c823e7afdae43301f7bffa7308bfce6e4ca110" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.193374 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.205378 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.214153 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:57:41 crc kubenswrapper[4727]: E1001 12:57:41.214630 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ad068e-3c83-4fd2-af0a-7e45cd945411" containerName="setup-container" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.214648 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ad068e-3c83-4fd2-af0a-7e45cd945411" containerName="setup-container" Oct 01 12:57:41 crc kubenswrapper[4727]: E1001 12:57:41.214675 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" containerName="setup-container" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.214684 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" containerName="setup-container" Oct 01 12:57:41 crc kubenswrapper[4727]: E1001 12:57:41.214703 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" containerName="rabbitmq" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.214710 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" containerName="rabbitmq" Oct 01 12:57:41 crc kubenswrapper[4727]: E1001 12:57:41.214736 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74ad068e-3c83-4fd2-af0a-7e45cd945411" containerName="rabbitmq" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.214745 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="74ad068e-3c83-4fd2-af0a-7e45cd945411" containerName="rabbitmq" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.215026 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="74ad068e-3c83-4fd2-af0a-7e45cd945411" containerName="rabbitmq" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.215052 4727 
memory_manager.go:354] "RemoveStaleState removing state" podUID="42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" containerName="rabbitmq" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.216287 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.221021 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.221721 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.221931 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-765c7" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.222086 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.222226 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.222489 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.222593 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.234842 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.249323 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.262428 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.262542 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.266587 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.266601 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.267067 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.267135 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.267315 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.267474 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f7tcr" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.267862 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.353730 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-config-data\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.353809 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.353937 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.353971 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.354022 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.354048 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.354072 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.354148 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhcng\" (UniqueName: \"kubernetes.io/projected/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-kube-api-access-xhcng\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.354198 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.354231 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.354274 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.455808 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.455867 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cd3bde15-3916-4632-97e6-50a7a6d2c60f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.455911 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cd3bde15-3916-4632-97e6-50a7a6d2c60f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.455945 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cd3bde15-3916-4632-97e6-50a7a6d2c60f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456092 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cd3bde15-3916-4632-97e6-50a7a6d2c60f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456124 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cd3bde15-3916-4632-97e6-50a7a6d2c60f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456166 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd7fx\" (UniqueName: \"kubernetes.io/projected/cd3bde15-3916-4632-97e6-50a7a6d2c60f-kube-api-access-rd7fx\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456204 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456237 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456263 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456288 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456315 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456338 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cd3bde15-3916-4632-97e6-50a7a6d2c60f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456365 4727 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhcng\" (UniqueName: \"kubernetes.io/projected/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-kube-api-access-xhcng\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456401 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456433 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456462 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456506 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cd3bde15-3916-4632-97e6-50a7a6d2c60f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456528 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456558 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-config-data\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456590 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd3bde15-3916-4632-97e6-50a7a6d2c60f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.456610 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cd3bde15-3916-4632-97e6-50a7a6d2c60f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.457386 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.457542 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.457735 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.457936 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-config-data\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.459826 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.460185 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.465051 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.465437 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.465777 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.467553 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.478302 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-67b789f86c-hcj6p"] Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.478934 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhcng\" (UniqueName: \"kubernetes.io/projected/4ed96f0b-b8d7-47f1-aa9c-3af04e140681-kube-api-access-xhcng\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.497856 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"4ed96f0b-b8d7-47f1-aa9c-3af04e140681\") " pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.557840 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cd3bde15-3916-4632-97e6-50a7a6d2c60f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.557901 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cd3bde15-3916-4632-97e6-50a7a6d2c60f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.557945 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd7fx\" (UniqueName: \"kubernetes.io/projected/cd3bde15-3916-4632-97e6-50a7a6d2c60f-kube-api-access-rd7fx\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.558026 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cd3bde15-3916-4632-97e6-50a7a6d2c60f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.558070 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.558130 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cd3bde15-3916-4632-97e6-50a7a6d2c60f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.558210 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd3bde15-3916-4632-97e6-50a7a6d2c60f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.558239 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/cd3bde15-3916-4632-97e6-50a7a6d2c60f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.558270 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cd3bde15-3916-4632-97e6-50a7a6d2c60f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.558311 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cd3bde15-3916-4632-97e6-50a7a6d2c60f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.558344 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cd3bde15-3916-4632-97e6-50a7a6d2c60f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.558839 4727 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.559461 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cd3bde15-3916-4632-97e6-50a7a6d2c60f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.559494 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd3bde15-3916-4632-97e6-50a7a6d2c60f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.559845 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cd3bde15-3916-4632-97e6-50a7a6d2c60f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.559869 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cd3bde15-3916-4632-97e6-50a7a6d2c60f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.559880 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cd3bde15-3916-4632-97e6-50a7a6d2c60f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.562427 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cd3bde15-3916-4632-97e6-50a7a6d2c60f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.563310 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cd3bde15-3916-4632-97e6-50a7a6d2c60f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.563384 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cd3bde15-3916-4632-97e6-50a7a6d2c60f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.572385 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cd3bde15-3916-4632-97e6-50a7a6d2c60f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.580569 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.590396 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd7fx\" (UniqueName: \"kubernetes.io/projected/cd3bde15-3916-4632-97e6-50a7a6d2c60f-kube-api-access-rd7fx\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.602585 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cd3bde15-3916-4632-97e6-50a7a6d2c60f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.734559 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" event={"ID":"e19b79e9-2913-42e3-9257-b2750475ace3","Type":"ContainerStarted","Data":"563afdd8ffad3ca310f8579bbc461b2ed2209a251299748fa59749bfa7519cec"} Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.735182 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" event={"ID":"e19b79e9-2913-42e3-9257-b2750475ace3","Type":"ContainerStarted","Data":"ccecdc02f750c6a75fc26df32cc08fafddad6a99c27fdd9e2963edc42687ff49"} Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.764193 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f4fac25-c782-4f4c-ab50-62969ea1f369","Type":"ContainerStarted","Data":"63dc9bc5f7734cbdfca6a4fc141e49feea3354c4b446f4d0ca4b5712359ba399"} Oct 01 12:57:41 crc kubenswrapper[4727]: I1001 12:57:41.895030 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:57:42 crc kubenswrapper[4727]: I1001 12:57:42.103145 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 12:57:42 crc kubenswrapper[4727]: W1001 12:57:42.127027 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ed96f0b_b8d7_47f1_aa9c_3af04e140681.slice/crio-eb2f7f1d035946bd71b761f3f755bdcb709682300d88cc49f4bd49f43d94bc9f WatchSource:0}: Error finding container eb2f7f1d035946bd71b761f3f755bdcb709682300d88cc49f4bd49f43d94bc9f: Status 404 returned error can't find the container with id eb2f7f1d035946bd71b761f3f755bdcb709682300d88cc49f4bd49f43d94bc9f Oct 01 12:57:42 crc kubenswrapper[4727]: I1001 12:57:42.195488 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 12:57:42 crc kubenswrapper[4727]: I1001 12:57:42.384089 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c8d9a9-fa0f-44c5-9ac1-2361f24c0876" path="/var/lib/kubelet/pods/42c8d9a9-fa0f-44c5-9ac1-2361f24c0876/volumes" Oct 01 12:57:42 crc kubenswrapper[4727]: I1001 12:57:42.385114 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74ad068e-3c83-4fd2-af0a-7e45cd945411" path="/var/lib/kubelet/pods/74ad068e-3c83-4fd2-af0a-7e45cd945411/volumes" Oct 01 12:57:42 crc kubenswrapper[4727]: I1001 12:57:42.775959 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f4fac25-c782-4f4c-ab50-62969ea1f369","Type":"ContainerStarted","Data":"e56268bdcc5fec5c8ab83a474aee84267e4ed72ce3a0fe6f4b4f95ede11cbd63"} Oct 01 12:57:42 crc kubenswrapper[4727]: I1001 12:57:42.777660 4727 generic.go:334] "Generic (PLEG): container finished" podID="e19b79e9-2913-42e3-9257-b2750475ace3" containerID="563afdd8ffad3ca310f8579bbc461b2ed2209a251299748fa59749bfa7519cec" exitCode=0 Oct 01 12:57:42 crc kubenswrapper[4727]: I1001 12:57:42.777703 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" event={"ID":"e19b79e9-2913-42e3-9257-b2750475ace3","Type":"ContainerDied","Data":"563afdd8ffad3ca310f8579bbc461b2ed2209a251299748fa59749bfa7519cec"} Oct 01 12:57:42 crc kubenswrapper[4727]: I1001 12:57:42.783755 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cd3bde15-3916-4632-97e6-50a7a6d2c60f","Type":"ContainerStarted","Data":"d7dad4bdccfc11c3bd62ee75991c2d2a8e5a82fa82a704d1ea0b8764f9ccc334"} Oct 01 12:57:42 crc kubenswrapper[4727]: I1001 12:57:42.785386 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ed96f0b-b8d7-47f1-aa9c-3af04e140681","Type":"ContainerStarted","Data":"eb2f7f1d035946bd71b761f3f755bdcb709682300d88cc49f4bd49f43d94bc9f"} Oct 01 12:57:43 crc kubenswrapper[4727]: I1001 12:57:43.799267 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ed96f0b-b8d7-47f1-aa9c-3af04e140681","Type":"ContainerStarted","Data":"f96af944bd637b462be21e27b7715f1c10301e8b70ed2a6505b40740e96acb28"} Oct 01 12:57:43 crc kubenswrapper[4727]: I1001 12:57:43.803623 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f4fac25-c782-4f4c-ab50-62969ea1f369","Type":"ContainerStarted","Data":"f4923fa89c2267ae488b8d88230846a1379770af16007d56e8cc20a9ecff0ee3"} Oct 01 12:57:43 crc kubenswrapper[4727]: I1001 
12:57:43.817155 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" event={"ID":"e19b79e9-2913-42e3-9257-b2750475ace3","Type":"ContainerStarted","Data":"caabaacb4f936f97251ae8d7461306a7c895a2418f40a12965e723fecf38b167"} Oct 01 12:57:43 crc kubenswrapper[4727]: I1001 12:57:43.819840 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:43 crc kubenswrapper[4727]: I1001 12:57:43.832492 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cd3bde15-3916-4632-97e6-50a7a6d2c60f","Type":"ContainerStarted","Data":"22fba6d0a76c31599f66112952fea22450c9fe0b0311aec0d8ee8f929fbfa62a"} Oct 01 12:57:43 crc kubenswrapper[4727]: I1001 12:57:43.849634 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" podStartSLOduration=3.849612385 podStartE2EDuration="3.849612385s" podCreationTimestamp="2025-10-01 12:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:57:43.845033712 +0000 UTC m=+1242.166388559" watchObservedRunningTime="2025-10-01 12:57:43.849612385 +0000 UTC m=+1242.170967222" Oct 01 12:57:44 crc kubenswrapper[4727]: I1001 12:57:44.845311 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f4fac25-c782-4f4c-ab50-62969ea1f369","Type":"ContainerStarted","Data":"944afde0e103fc73e7a47ed130e684d0c440d8e628f93c3b732a5a221fd4c011"} Oct 01 12:57:44 crc kubenswrapper[4727]: I1001 12:57:44.845409 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 12:57:44 crc kubenswrapper[4727]: I1001 12:57:44.867180 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.99724141 podStartE2EDuration="15.86716305s" podCreationTimestamp="2025-10-01 12:57:29 +0000 UTC" firstStartedPulling="2025-10-01 12:57:30.464378069 +0000 UTC m=+1228.785732906" lastFinishedPulling="2025-10-01 12:57:44.334299709 +0000 UTC m=+1242.655654546" observedRunningTime="2025-10-01 12:57:44.865517598 +0000 UTC m=+1243.186872445" watchObservedRunningTime="2025-10-01 12:57:44.86716305 +0000 UTC m=+1243.188517887" Oct 01 12:57:50 crc kubenswrapper[4727]: I1001 12:57:50.928810 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:57:50 crc kubenswrapper[4727]: I1001 12:57:50.983443 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-lzvjj"] Oct 01 12:57:50 crc kubenswrapper[4727]: I1001 12:57:50.983689 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" podUID="3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" containerName="dnsmasq-dns" containerID="cri-o://133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce" gracePeriod=10 Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.119295 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-65bcw"] Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.123613 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.136177 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-65bcw"] Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.268156 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-config\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.268220 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.268250 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.268278 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.268301 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmwwm\" (UniqueName: \"kubernetes.io/projected/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-kube-api-access-hmwwm\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.268495 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.268722 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.375259 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.375317 4727 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hmwwm\" (UniqueName: \"kubernetes.io/projected/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-kube-api-access-hmwwm\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.375361 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.375424 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.375478 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-config\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.375502 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.375523 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.376652 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.377218 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.379955 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.380352 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.380703 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.381607 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-config\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.404327 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmwwm\" (UniqueName: \"kubernetes.io/projected/aa6f6783-3b1f-4c21-aee4-6f35cf66d17f-kube-api-access-hmwwm\") pod \"dnsmasq-dns-cb6ffcf87-65bcw\" (UID: \"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f\") " pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.477632 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.598325 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.782542 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-ovsdbserver-sb\") pod \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.782862 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-dns-svc\") pod \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.782899 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-dns-swift-storage-0\") pod \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.783071 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9bzf\" (UniqueName: \"kubernetes.io/projected/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-kube-api-access-p9bzf\") pod \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.783128 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-config\") pod \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 
12:57:51.783168 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-ovsdbserver-nb\") pod \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\" (UID: \"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff\") " Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.787557 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-kube-api-access-p9bzf" (OuterVolumeSpecName: "kube-api-access-p9bzf") pod "3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" (UID: "3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff"). InnerVolumeSpecName "kube-api-access-p9bzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.831971 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" (UID: "3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.833207 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" (UID: "3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.834134 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-config" (OuterVolumeSpecName: "config") pod "3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" (UID: "3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.837418 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" (UID: "3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.840892 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" (UID: "3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.885422 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9bzf\" (UniqueName: \"kubernetes.io/projected/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-kube-api-access-p9bzf\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.885452 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.885496 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.885507 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.885515 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.885523 4727 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.919439 4727 generic.go:334] "Generic (PLEG): container finished" podID="3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" containerID="133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce" exitCode=0 Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.919486 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.919484 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" event={"ID":"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff","Type":"ContainerDied","Data":"133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce"} Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.919548 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-lzvjj" event={"ID":"3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff","Type":"ContainerDied","Data":"807f1b96b17d8999cf70fc05d8885f7b38bbd3fe0dd7d4f12b8ec25e7f15803d"} Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.919601 4727 scope.go:117] "RemoveContainer" containerID="133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.923792 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-65bcw"] Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.940853 4727 scope.go:117] "RemoveContainer" containerID="2a3c96ded502cb081fa8f39f3cbc04384e408297e9e2b596eaa95b0dfa5ab701" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.954056 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-lzvjj"] Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.962034 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-lzvjj"] Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.977446 4727 scope.go:117] "RemoveContainer" containerID="133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce" Oct 01 12:57:51 crc kubenswrapper[4727]: E1001 12:57:51.977774 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce\": container with ID starting with 133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce not found: ID does not exist" containerID="133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.977802 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce"} err="failed to get container status \"133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce\": rpc error: code = NotFound desc = could not find container \"133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce\": container with ID starting with 133a818030e2ad157b3e9dbd8387fb3a35dc671c77977f2c36914c940d21f7ce not found: ID does not exist" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.977821 4727 scope.go:117] "RemoveContainer" containerID="2a3c96ded502cb081fa8f39f3cbc04384e408297e9e2b596eaa95b0dfa5ab701" Oct 01 12:57:51 crc kubenswrapper[4727]: E1001 12:57:51.978222 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3c96ded502cb081fa8f39f3cbc04384e408297e9e2b596eaa95b0dfa5ab701\": container with ID starting with 2a3c96ded502cb081fa8f39f3cbc04384e408297e9e2b596eaa95b0dfa5ab701 not found: ID does not exist" containerID="2a3c96ded502cb081fa8f39f3cbc04384e408297e9e2b596eaa95b0dfa5ab701" Oct 01 12:57:51 crc kubenswrapper[4727]: I1001 12:57:51.978303 4727 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2a3c96ded502cb081fa8f39f3cbc04384e408297e9e2b596eaa95b0dfa5ab701"} err="failed to get container status \"2a3c96ded502cb081fa8f39f3cbc04384e408297e9e2b596eaa95b0dfa5ab701\": rpc error: code = NotFound desc = could not find container \"2a3c96ded502cb081fa8f39f3cbc04384e408297e9e2b596eaa95b0dfa5ab701\": container with ID starting with 2a3c96ded502cb081fa8f39f3cbc04384e408297e9e2b596eaa95b0dfa5ab701 not found: ID does not exist" Oct 01 12:57:52 crc kubenswrapper[4727]: I1001 12:57:52.385753 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" path="/var/lib/kubelet/pods/3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff/volumes" Oct 01 12:57:52 crc kubenswrapper[4727]: I1001 12:57:52.931753 4727 generic.go:334] "Generic (PLEG): container finished" podID="aa6f6783-3b1f-4c21-aee4-6f35cf66d17f" containerID="95242f65dd1f69a5941ffdcc63316c91e8029ac1b2811e895c290e99a5c493ce" exitCode=0 Oct 01 12:57:52 crc kubenswrapper[4727]: I1001 12:57:52.931813 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" event={"ID":"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f","Type":"ContainerDied","Data":"95242f65dd1f69a5941ffdcc63316c91e8029ac1b2811e895c290e99a5c493ce"} Oct 01 12:57:52 crc kubenswrapper[4727]: I1001 12:57:52.931850 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" event={"ID":"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f","Type":"ContainerStarted","Data":"d46829be974e84c9241546ae54046b1a027c0f515058ac5f77668c8a14f895c4"} Oct 01 12:57:53 crc kubenswrapper[4727]: I1001 12:57:53.944086 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" event={"ID":"aa6f6783-3b1f-4c21-aee4-6f35cf66d17f","Type":"ContainerStarted","Data":"cb5464ec87ab1ae108816b088a031cfd7e8d9eec4286440ff600eadc9c41153a"} Oct 01 12:57:53 crc kubenswrapper[4727]: I1001 12:57:53.945567 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:57:53 crc kubenswrapper[4727]: I1001 12:57:53.969240 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" podStartSLOduration=2.969224518 podStartE2EDuration="2.969224518s" podCreationTimestamp="2025-10-01 12:57:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:57:53.968198305 +0000 UTC m=+1252.289553152" watchObservedRunningTime="2025-10-01 12:57:53.969224518 +0000 UTC m=+1252.290579355" Oct 01 12:58:00 crc kubenswrapper[4727]: I1001 12:58:00.002803 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 12:58:01 crc kubenswrapper[4727]: I1001 12:58:01.480184 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-65bcw" Oct 01 12:58:01 crc kubenswrapper[4727]: I1001 12:58:01.567244 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-hcj6p"] Oct 01 12:58:01 crc kubenswrapper[4727]: I1001 12:58:01.567530 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" podUID="e19b79e9-2913-42e3-9257-b2750475ace3" containerName="dnsmasq-dns" containerID="cri-o://caabaacb4f936f97251ae8d7461306a7c895a2418f40a12965e723fecf38b167" gracePeriod=10 Oct 01 12:58:02 crc 
kubenswrapper[4727]: I1001 12:58:02.023250 4727 generic.go:334] "Generic (PLEG): container finished" podID="e19b79e9-2913-42e3-9257-b2750475ace3" containerID="caabaacb4f936f97251ae8d7461306a7c895a2418f40a12965e723fecf38b167" exitCode=0 Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.023293 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" event={"ID":"e19b79e9-2913-42e3-9257-b2750475ace3","Type":"ContainerDied","Data":"caabaacb4f936f97251ae8d7461306a7c895a2418f40a12965e723fecf38b167"} Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.023318 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" event={"ID":"e19b79e9-2913-42e3-9257-b2750475ace3","Type":"ContainerDied","Data":"ccecdc02f750c6a75fc26df32cc08fafddad6a99c27fdd9e2963edc42687ff49"} Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.023344 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccecdc02f750c6a75fc26df32cc08fafddad6a99c27fdd9e2963edc42687ff49" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.100403 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.222975 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-ovsdbserver-nb\") pod \"e19b79e9-2913-42e3-9257-b2750475ace3\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.223054 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fjs9\" (UniqueName: \"kubernetes.io/projected/e19b79e9-2913-42e3-9257-b2750475ace3-kube-api-access-6fjs9\") pod \"e19b79e9-2913-42e3-9257-b2750475ace3\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.223206 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-ovsdbserver-sb\") pod \"e19b79e9-2913-42e3-9257-b2750475ace3\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.223312 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-openstack-edpm-ipam\") pod \"e19b79e9-2913-42e3-9257-b2750475ace3\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.223360 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-dns-svc\") pod \"e19b79e9-2913-42e3-9257-b2750475ace3\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.223412 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-dns-swift-storage-0\") pod \"e19b79e9-2913-42e3-9257-b2750475ace3\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.223454 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-config\") pod \"e19b79e9-2913-42e3-9257-b2750475ace3\" (UID: \"e19b79e9-2913-42e3-9257-b2750475ace3\") " Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.230318 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19b79e9-2913-42e3-9257-b2750475ace3-kube-api-access-6fjs9" (OuterVolumeSpecName: "kube-api-access-6fjs9") pod "e19b79e9-2913-42e3-9257-b2750475ace3" (UID: "e19b79e9-2913-42e3-9257-b2750475ace3"). InnerVolumeSpecName "kube-api-access-6fjs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.276963 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-config" (OuterVolumeSpecName: "config") pod "e19b79e9-2913-42e3-9257-b2750475ace3" (UID: "e19b79e9-2913-42e3-9257-b2750475ace3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.279376 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e19b79e9-2913-42e3-9257-b2750475ace3" (UID: "e19b79e9-2913-42e3-9257-b2750475ace3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.282561 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "e19b79e9-2913-42e3-9257-b2750475ace3" (UID: "e19b79e9-2913-42e3-9257-b2750475ace3"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.296263 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e19b79e9-2913-42e3-9257-b2750475ace3" (UID: "e19b79e9-2913-42e3-9257-b2750475ace3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.300773 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e19b79e9-2913-42e3-9257-b2750475ace3" (UID: "e19b79e9-2913-42e3-9257-b2750475ace3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.310320 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e19b79e9-2913-42e3-9257-b2750475ace3" (UID: "e19b79e9-2913-42e3-9257-b2750475ace3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.325095 4727 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.325138 4727 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.325154 4727 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.325165 4727 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-config\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.325176 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.325189 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fjs9\" (UniqueName: \"kubernetes.io/projected/e19b79e9-2913-42e3-9257-b2750475ace3-kube-api-access-6fjs9\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:02 crc kubenswrapper[4727]: I1001 12:58:02.325201 4727 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e19b79e9-2913-42e3-9257-b2750475ace3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:03 crc kubenswrapper[4727]: I1001 12:58:03.030472 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-hcj6p" Oct 01 12:58:03 crc kubenswrapper[4727]: I1001 12:58:03.054427 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-hcj6p"] Oct 01 12:58:03 crc kubenswrapper[4727]: I1001 12:58:03.063630 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-hcj6p"] Oct 01 12:58:03 crc kubenswrapper[4727]: E1001 12:58:03.666138 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19b79e9_2913_42e3_9257_b2750475ace3.slice/crio-ccecdc02f750c6a75fc26df32cc08fafddad6a99c27fdd9e2963edc42687ff49\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19b79e9_2913_42e3_9257_b2750475ace3.slice\": RecentStats: unable to find data in memory cache]" Oct 01 12:58:04 crc kubenswrapper[4727]: I1001 12:58:04.383139 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e19b79e9-2913-42e3-9257-b2750475ace3" path="/var/lib/kubelet/pods/e19b79e9-2913-42e3-9257-b2750475ace3/volumes" Oct 01 12:58:08 crc kubenswrapper[4727]: I1001 12:58:08.847686 4727 scope.go:117] "RemoveContainer" containerID="f1fb8e78533bbf52baf5a0c050a858cd1f1c1685dfb78834d97e9dca7b4d4504" Oct 01 12:58:13 crc kubenswrapper[4727]: E1001 12:58:13.906872 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19b79e9_2913_42e3_9257_b2750475ace3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19b79e9_2913_42e3_9257_b2750475ace3.slice/crio-ccecdc02f750c6a75fc26df32cc08fafddad6a99c27fdd9e2963edc42687ff49\": RecentStats: unable to find data in memory cache]" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.453471 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf"] Oct 01 12:58:14 crc kubenswrapper[4727]: E1001 12:58:14.453969 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19b79e9-2913-42e3-9257-b2750475ace3" containerName="dnsmasq-dns" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.453993 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19b79e9-2913-42e3-9257-b2750475ace3" containerName="dnsmasq-dns" Oct 01 12:58:14 crc kubenswrapper[4727]: E1001 12:58:14.454034 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" containerName="dnsmasq-dns" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.454043 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" containerName="dnsmasq-dns" Oct 01 12:58:14 crc kubenswrapper[4727]: E1001 12:58:14.454067 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" containerName="init" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.454076 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" containerName="init" Oct 01 12:58:14 crc kubenswrapper[4727]: E1001 12:58:14.454094 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19b79e9-2913-42e3-9257-b2750475ace3" containerName="init" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 
12:58:14.454101 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19b79e9-2913-42e3-9257-b2750475ace3" containerName="init" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.454338 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19b79e9-2913-42e3-9257-b2750475ace3" containerName="dnsmasq-dns" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.454354 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7e9ecd-3bb9-4f1a-a8f5-eb31340566ff" containerName="dnsmasq-dns" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.455170 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.457558 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.457711 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.458425 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.458826 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.462093 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf"] Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.561632 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.561880 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdlq2\" (UniqueName: \"kubernetes.io/projected/87b03652-a89c-43d2-9cef-c78c540c52a8-kube-api-access-sdlq2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.562404 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.562449 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.664482 4727 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdlq2\" (UniqueName: \"kubernetes.io/projected/87b03652-a89c-43d2-9cef-c78c540c52a8-kube-api-access-sdlq2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.664774 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.664804 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.664888 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.671499 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.671546 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.680500 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.683924 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdlq2\" (UniqueName: \"kubernetes.io/projected/87b03652-a89c-43d2-9cef-c78c540c52a8-kube-api-access-sdlq2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:14 crc kubenswrapper[4727]: I1001 12:58:14.781860 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:15 crc kubenswrapper[4727]: I1001 12:58:15.322667 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf"] Oct 01 12:58:15 crc kubenswrapper[4727]: W1001 12:58:15.327209 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87b03652_a89c_43d2_9cef_c78c540c52a8.slice/crio-e11b4e7f4b367ea7c728af29ec38a0432334a85421a7f94d00b030619fc2fa43 WatchSource:0}: Error finding container e11b4e7f4b367ea7c728af29ec38a0432334a85421a7f94d00b030619fc2fa43: Status 404 returned error can't find the container with id e11b4e7f4b367ea7c728af29ec38a0432334a85421a7f94d00b030619fc2fa43 Oct 01 12:58:16 crc kubenswrapper[4727]: I1001 12:58:16.173237 4727 generic.go:334] "Generic (PLEG): container finished" podID="4ed96f0b-b8d7-47f1-aa9c-3af04e140681" containerID="f96af944bd637b462be21e27b7715f1c10301e8b70ed2a6505b40740e96acb28" exitCode=0 Oct 01 12:58:16 crc kubenswrapper[4727]: I1001 12:58:16.173467 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ed96f0b-b8d7-47f1-aa9c-3af04e140681","Type":"ContainerDied","Data":"f96af944bd637b462be21e27b7715f1c10301e8b70ed2a6505b40740e96acb28"} Oct 01 12:58:16 crc kubenswrapper[4727]: I1001 12:58:16.183969 4727 generic.go:334] "Generic (PLEG): container finished" podID="cd3bde15-3916-4632-97e6-50a7a6d2c60f" containerID="22fba6d0a76c31599f66112952fea22450c9fe0b0311aec0d8ee8f929fbfa62a" exitCode=0 Oct 01 12:58:16 crc kubenswrapper[4727]: I1001 12:58:16.184054 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cd3bde15-3916-4632-97e6-50a7a6d2c60f","Type":"ContainerDied","Data":"22fba6d0a76c31599f66112952fea22450c9fe0b0311aec0d8ee8f929fbfa62a"} Oct 01 12:58:16 crc kubenswrapper[4727]: I1001 12:58:16.188176 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" event={"ID":"87b03652-a89c-43d2-9cef-c78c540c52a8","Type":"ContainerStarted","Data":"e11b4e7f4b367ea7c728af29ec38a0432334a85421a7f94d00b030619fc2fa43"} Oct 01 12:58:17 crc kubenswrapper[4727]: I1001 12:58:17.208334 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cd3bde15-3916-4632-97e6-50a7a6d2c60f","Type":"ContainerStarted","Data":"e772e31bd6a1148fbb49af7b275c3fd0e8cae2ff7c1fa98144698364a46f7a8f"} Oct 01 12:58:17 crc kubenswrapper[4727]: I1001 12:58:17.208839 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:58:17 crc kubenswrapper[4727]: I1001 12:58:17.216416 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ed96f0b-b8d7-47f1-aa9c-3af04e140681","Type":"ContainerStarted","Data":"96abb25a2df8ef2ac5dcc0a6abe29ba11d1d93a4020e6d752a185ecd7f3fcc83"} Oct 01 12:58:17 crc kubenswrapper[4727]: I1001 12:58:17.217337 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 12:58:17 crc kubenswrapper[4727]: I1001 12:58:17.239705 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.239683932 podStartE2EDuration="36.239683932s" podCreationTimestamp="2025-10-01 12:57:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:58:17.23260569 +0000 UTC m=+1275.553960547" watchObservedRunningTime="2025-10-01 12:58:17.239683932 +0000 UTC m=+1275.561038769" Oct 01 12:58:17 crc kubenswrapper[4727]: I1001 12:58:17.267023 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.266991566 podStartE2EDuration="36.266991566s" podCreationTimestamp="2025-10-01 12:57:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 12:58:17.256454237 +0000 UTC m=+1275.577809084" watchObservedRunningTime="2025-10-01 12:58:17.266991566 +0000 UTC m=+1275.588346403" Oct 01 12:58:24 crc kubenswrapper[4727]: E1001 12:58:24.146294 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19b79e9_2913_42e3_9257_b2750475ace3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19b79e9_2913_42e3_9257_b2750475ace3.slice/crio-ccecdc02f750c6a75fc26df32cc08fafddad6a99c27fdd9e2963edc42687ff49\": RecentStats: unable to find data in memory cache]" Oct 01 12:58:25 crc kubenswrapper[4727]: I1001 12:58:25.316730 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" event={"ID":"87b03652-a89c-43d2-9cef-c78c540c52a8","Type":"ContainerStarted","Data":"aa15a9bdc64a05f6ba74aeb886df3019354533c1d9c8be90b7c61b4b17584023"} Oct 01 12:58:25 crc kubenswrapper[4727]: I1001 12:58:25.345348 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" podStartSLOduration=2.275176868 podStartE2EDuration="11.345320287s" podCreationTimestamp="2025-10-01 12:58:14 +0000 UTC" firstStartedPulling="2025-10-01 12:58:15.328723668 +0000 UTC m=+1273.650078505" lastFinishedPulling="2025-10-01 12:58:24.398867087 +0000 UTC m=+1282.720221924" observedRunningTime="2025-10-01 12:58:25.336184671 +0000 UTC m=+1283.657539508" watchObservedRunningTime="2025-10-01 12:58:25.345320287 +0000 UTC m=+1283.666675124" Oct 01 12:58:31 crc kubenswrapper[4727]: I1001 12:58:31.584289 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 12:58:31 crc kubenswrapper[4727]: I1001 12:58:31.904239 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 12:58:33 crc kubenswrapper[4727]: I1001 12:58:33.292402 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:58:33 crc kubenswrapper[4727]: I1001 12:58:33.292468 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:58:34 crc kubenswrapper[4727]: E1001 12:58:34.373740 
4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19b79e9_2913_42e3_9257_b2750475ace3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19b79e9_2913_42e3_9257_b2750475ace3.slice/crio-ccecdc02f750c6a75fc26df32cc08fafddad6a99c27fdd9e2963edc42687ff49\": RecentStats: unable to find data in memory cache]" Oct 01 12:58:36 crc kubenswrapper[4727]: I1001 12:58:36.434648 4727 generic.go:334] "Generic (PLEG): container finished" podID="87b03652-a89c-43d2-9cef-c78c540c52a8" containerID="aa15a9bdc64a05f6ba74aeb886df3019354533c1d9c8be90b7c61b4b17584023" exitCode=0 Oct 01 12:58:36 crc kubenswrapper[4727]: I1001 12:58:36.434741 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" event={"ID":"87b03652-a89c-43d2-9cef-c78c540c52a8","Type":"ContainerDied","Data":"aa15a9bdc64a05f6ba74aeb886df3019354533c1d9c8be90b7c61b4b17584023"} Oct 01 12:58:37 crc kubenswrapper[4727]: I1001 12:58:37.888664 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:37 crc kubenswrapper[4727]: I1001 12:58:37.930110 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-repo-setup-combined-ca-bundle\") pod \"87b03652-a89c-43d2-9cef-c78c540c52a8\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " Oct 01 12:58:37 crc kubenswrapper[4727]: I1001 12:58:37.930832 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-inventory\") pod \"87b03652-a89c-43d2-9cef-c78c540c52a8\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " Oct 01 12:58:37 crc kubenswrapper[4727]: I1001 12:58:37.930960 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-ssh-key\") pod \"87b03652-a89c-43d2-9cef-c78c540c52a8\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " Oct 01 12:58:37 crc kubenswrapper[4727]: I1001 12:58:37.931037 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdlq2\" (UniqueName: \"kubernetes.io/projected/87b03652-a89c-43d2-9cef-c78c540c52a8-kube-api-access-sdlq2\") pod \"87b03652-a89c-43d2-9cef-c78c540c52a8\" (UID: \"87b03652-a89c-43d2-9cef-c78c540c52a8\") " Oct 01 12:58:37 crc kubenswrapper[4727]: I1001 12:58:37.936430 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87b03652-a89c-43d2-9cef-c78c540c52a8-kube-api-access-sdlq2" (OuterVolumeSpecName: "kube-api-access-sdlq2") pod "87b03652-a89c-43d2-9cef-c78c540c52a8" (UID: "87b03652-a89c-43d2-9cef-c78c540c52a8"). InnerVolumeSpecName "kube-api-access-sdlq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:58:37 crc kubenswrapper[4727]: I1001 12:58:37.936610 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "87b03652-a89c-43d2-9cef-c78c540c52a8" (UID: "87b03652-a89c-43d2-9cef-c78c540c52a8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:58:37 crc kubenswrapper[4727]: I1001 12:58:37.962901 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "87b03652-a89c-43d2-9cef-c78c540c52a8" (UID: "87b03652-a89c-43d2-9cef-c78c540c52a8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:58:37 crc kubenswrapper[4727]: I1001 12:58:37.963044 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-inventory" (OuterVolumeSpecName: "inventory") pod "87b03652-a89c-43d2-9cef-c78c540c52a8" (UID: "87b03652-a89c-43d2-9cef-c78c540c52a8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.033670 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.033726 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.033740 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdlq2\" (UniqueName: \"kubernetes.io/projected/87b03652-a89c-43d2-9cef-c78c540c52a8-kube-api-access-sdlq2\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.033756 4727 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b03652-a89c-43d2-9cef-c78c540c52a8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.455425 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" event={"ID":"87b03652-a89c-43d2-9cef-c78c540c52a8","Type":"ContainerDied","Data":"e11b4e7f4b367ea7c728af29ec38a0432334a85421a7f94d00b030619fc2fa43"} Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.455464 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e11b4e7f4b367ea7c728af29ec38a0432334a85421a7f94d00b030619fc2fa43" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.455517 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.535883 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj"] Oct 01 12:58:38 crc kubenswrapper[4727]: E1001 12:58:38.536298 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b03652-a89c-43d2-9cef-c78c540c52a8" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.536315 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b03652-a89c-43d2-9cef-c78c540c52a8" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.536717 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="87b03652-a89c-43d2-9cef-c78c540c52a8" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.537382 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.540414 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.540468 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.540414 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.540683 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.556916 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj"] Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.644521 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef6454da-b104-45bf-870f-feecead2142f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pjnj\" (UID: \"ef6454da-b104-45bf-870f-feecead2142f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.644928 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlk76\" (UniqueName: \"kubernetes.io/projected/ef6454da-b104-45bf-870f-feecead2142f-kube-api-access-dlk76\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pjnj\" (UID: \"ef6454da-b104-45bf-870f-feecead2142f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.644960 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef6454da-b104-45bf-870f-feecead2142f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pjnj\" (UID: \"ef6454da-b104-45bf-870f-feecead2142f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.747103 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/ef6454da-b104-45bf-870f-feecead2142f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pjnj\" (UID: \"ef6454da-b104-45bf-870f-feecead2142f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.747184 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlk76\" (UniqueName: \"kubernetes.io/projected/ef6454da-b104-45bf-870f-feecead2142f-kube-api-access-dlk76\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pjnj\" (UID: \"ef6454da-b104-45bf-870f-feecead2142f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.747210 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef6454da-b104-45bf-870f-feecead2142f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pjnj\" (UID: \"ef6454da-b104-45bf-870f-feecead2142f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.751411 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef6454da-b104-45bf-870f-feecead2142f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pjnj\" (UID: \"ef6454da-b104-45bf-870f-feecead2142f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.752110 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef6454da-b104-45bf-870f-feecead2142f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pjnj\" (UID: \"ef6454da-b104-45bf-870f-feecead2142f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.770290 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlk76\" (UniqueName: \"kubernetes.io/projected/ef6454da-b104-45bf-870f-feecead2142f-kube-api-access-dlk76\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2pjnj\" (UID: \"ef6454da-b104-45bf-870f-feecead2142f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:38 crc kubenswrapper[4727]: I1001 12:58:38.855899 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:39 crc kubenswrapper[4727]: W1001 12:58:39.389299 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef6454da_b104_45bf_870f_feecead2142f.slice/crio-e9f2feb8d2d4782207a5a8997f19827dee2e6b66be48ddfeeef7d51cf7ee4fb8 WatchSource:0}: Error finding container e9f2feb8d2d4782207a5a8997f19827dee2e6b66be48ddfeeef7d51cf7ee4fb8: Status 404 returned error can't find the container with id e9f2feb8d2d4782207a5a8997f19827dee2e6b66be48ddfeeef7d51cf7ee4fb8 Oct 01 12:58:39 crc kubenswrapper[4727]: I1001 12:58:39.390069 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj"] Oct 01 12:58:39 crc kubenswrapper[4727]: I1001 12:58:39.470441 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" event={"ID":"ef6454da-b104-45bf-870f-feecead2142f","Type":"ContainerStarted","Data":"e9f2feb8d2d4782207a5a8997f19827dee2e6b66be48ddfeeef7d51cf7ee4fb8"} Oct 01 12:58:40 crc kubenswrapper[4727]: I1001 12:58:40.480601 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" event={"ID":"ef6454da-b104-45bf-870f-feecead2142f","Type":"ContainerStarted","Data":"eb6f4e39b7f88e8b88999b2264a1f0480e54d66d2e541ec6159b2ce41a83e100"} Oct 01 12:58:40 crc kubenswrapper[4727]: I1001 12:58:40.499534 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" podStartSLOduration=2.062795932 podStartE2EDuration="2.499515744s" podCreationTimestamp="2025-10-01 12:58:38 +0000 UTC" firstStartedPulling="2025-10-01 12:58:39.393142932 +0000 UTC m=+1297.714497779" lastFinishedPulling="2025-10-01 12:58:39.829862754 +0000 UTC m=+1298.151217591" observedRunningTime="2025-10-01 12:58:40.493644301 +0000 UTC m=+1298.814999148" watchObservedRunningTime="2025-10-01 12:58:40.499515744 +0000 UTC m=+1298.820870581" Oct 01 12:58:43 crc kubenswrapper[4727]: I1001 12:58:43.511617 4727 generic.go:334] "Generic (PLEG): container finished" podID="ef6454da-b104-45bf-870f-feecead2142f" containerID="eb6f4e39b7f88e8b88999b2264a1f0480e54d66d2e541ec6159b2ce41a83e100" exitCode=0 Oct 01 12:58:43 crc kubenswrapper[4727]: I1001 12:58:43.511698 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" event={"ID":"ef6454da-b104-45bf-870f-feecead2142f","Type":"ContainerDied","Data":"eb6f4e39b7f88e8b88999b2264a1f0480e54d66d2e541ec6159b2ce41a83e100"} Oct 01 12:58:44 crc kubenswrapper[4727]: E1001 12:58:44.622510 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19b79e9_2913_42e3_9257_b2750475ace3.slice/crio-ccecdc02f750c6a75fc26df32cc08fafddad6a99c27fdd9e2963edc42687ff49\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19b79e9_2913_42e3_9257_b2750475ace3.slice\": RecentStats: unable to find data in memory cache]" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.023610 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.100924 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef6454da-b104-45bf-870f-feecead2142f-inventory\") pod \"ef6454da-b104-45bf-870f-feecead2142f\" (UID: \"ef6454da-b104-45bf-870f-feecead2142f\") " Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.101046 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef6454da-b104-45bf-870f-feecead2142f-ssh-key\") pod \"ef6454da-b104-45bf-870f-feecead2142f\" (UID: \"ef6454da-b104-45bf-870f-feecead2142f\") " Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.101130 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlk76\" (UniqueName: \"kubernetes.io/projected/ef6454da-b104-45bf-870f-feecead2142f-kube-api-access-dlk76\") pod \"ef6454da-b104-45bf-870f-feecead2142f\" (UID: \"ef6454da-b104-45bf-870f-feecead2142f\") " Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.108495 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef6454da-b104-45bf-870f-feecead2142f-kube-api-access-dlk76" (OuterVolumeSpecName: "kube-api-access-dlk76") pod "ef6454da-b104-45bf-870f-feecead2142f" (UID: "ef6454da-b104-45bf-870f-feecead2142f"). InnerVolumeSpecName "kube-api-access-dlk76". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.131578 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef6454da-b104-45bf-870f-feecead2142f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ef6454da-b104-45bf-870f-feecead2142f" (UID: "ef6454da-b104-45bf-870f-feecead2142f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.131934 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef6454da-b104-45bf-870f-feecead2142f-inventory" (OuterVolumeSpecName: "inventory") pod "ef6454da-b104-45bf-870f-feecead2142f" (UID: "ef6454da-b104-45bf-870f-feecead2142f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.204351 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlk76\" (UniqueName: \"kubernetes.io/projected/ef6454da-b104-45bf-870f-feecead2142f-kube-api-access-dlk76\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.204392 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef6454da-b104-45bf-870f-feecead2142f-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.204403 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef6454da-b104-45bf-870f-feecead2142f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.534875 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" event={"ID":"ef6454da-b104-45bf-870f-feecead2142f","Type":"ContainerDied","Data":"e9f2feb8d2d4782207a5a8997f19827dee2e6b66be48ddfeeef7d51cf7ee4fb8"} Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.535397 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9f2feb8d2d4782207a5a8997f19827dee2e6b66be48ddfeeef7d51cf7ee4fb8" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.535277 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2pjnj" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.612353 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx"] Oct 01 12:58:45 crc kubenswrapper[4727]: E1001 12:58:45.612735 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef6454da-b104-45bf-870f-feecead2142f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.612752 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef6454da-b104-45bf-870f-feecead2142f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.612950 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef6454da-b104-45bf-870f-feecead2142f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.613899 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.619122 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.619567 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.619908 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.620048 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.626280 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx"] Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.712804 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.712863 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.712903 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.712986 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxhhr\" (UniqueName: \"kubernetes.io/projected/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-kube-api-access-kxhhr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.815312 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.815379 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.815455 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxhhr\" (UniqueName: \"kubernetes.io/projected/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-kube-api-access-kxhhr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.815556 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.820463 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.821016 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.821625 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.839302 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxhhr\" (UniqueName: \"kubernetes.io/projected/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-kube-api-access-kxhhr\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:45 crc kubenswrapper[4727]: I1001 12:58:45.931957 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 12:58:46 crc kubenswrapper[4727]: I1001 12:58:46.448965 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx"] Oct 01 12:58:46 crc kubenswrapper[4727]: I1001 12:58:46.556948 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" event={"ID":"2f2522c5-4bf3-4d82-af9a-546abdb6c4be","Type":"ContainerStarted","Data":"f57de7ca028f9961eff7ee4703e5b61e91ad814ac42a666e092b8d6f9f282e9d"} Oct 01 12:58:47 crc kubenswrapper[4727]: I1001 12:58:47.567025 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" event={"ID":"2f2522c5-4bf3-4d82-af9a-546abdb6c4be","Type":"ContainerStarted","Data":"30ef675a8d7849d85d0ff2aa6e06a8dd8b7a65ac448ed727f82747a27d041ff4"} Oct 01 12:58:47 crc kubenswrapper[4727]: I1001 12:58:47.582474 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" podStartSLOduration=1.9718789810000001 podStartE2EDuration="2.582456544s" podCreationTimestamp="2025-10-01 12:58:45 +0000 UTC" firstStartedPulling="2025-10-01 12:58:46.462934839 +0000 UTC m=+1304.784289676" lastFinishedPulling="2025-10-01 12:58:47.073512402 +0000 UTC m=+1305.394867239" observedRunningTime="2025-10-01 12:58:47.578121628 +0000 UTC m=+1305.899476465" watchObservedRunningTime="2025-10-01 12:58:47.582456544 +0000 UTC m=+1305.903811381" Oct 01 12:58:54 crc kubenswrapper[4727]: E1001 12:58:54.895505 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19b79e9_2913_42e3_9257_b2750475ace3.slice/crio-ccecdc02f750c6a75fc26df32cc08fafddad6a99c27fdd9e2963edc42687ff49\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19b79e9_2913_42e3_9257_b2750475ace3.slice\": RecentStats: unable to find data in memory cache]" Oct 01 12:59:03 crc kubenswrapper[4727]: I1001 12:59:03.291785 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:59:03 crc kubenswrapper[4727]: I1001 12:59:03.292902 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:59:09 crc kubenswrapper[4727]: I1001 12:59:09.038903 4727 scope.go:117] "RemoveContainer" containerID="26a4427b3af7f6c00db2becc7c9b767e17e057e4a77167a0cd5c582227845a3c" Oct 01 12:59:09 crc kubenswrapper[4727]: I1001 12:59:09.080407 4727 scope.go:117] "RemoveContainer" containerID="f07080fd9430a01d2c99c0c3de32e8c23556bc26cf9169584c1f40a6ac15656f" Oct 01 12:59:09 crc kubenswrapper[4727]: I1001 12:59:09.112968 4727 scope.go:117] "RemoveContainer" containerID="a4603e5caddbf7627c75b0b98e350ee46f9285b810ec96e9169baf930f9cec2d" Oct 01 12:59:33 crc kubenswrapper[4727]: I1001 12:59:33.291456 4727 
patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 12:59:33 crc kubenswrapper[4727]: I1001 12:59:33.292073 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 12:59:33 crc kubenswrapper[4727]: I1001 12:59:33.292148 4727 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 12:59:33 crc kubenswrapper[4727]: I1001 12:59:33.292875 4727 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cacd2a9209dd857fc1890a57e560a24e6efca70576638e54f6197ee82d5463f5"} pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 12:59:33 crc kubenswrapper[4727]: I1001 12:59:33.292942 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" containerID="cri-o://cacd2a9209dd857fc1890a57e560a24e6efca70576638e54f6197ee82d5463f5" gracePeriod=600 Oct 01 12:59:34 crc kubenswrapper[4727]: I1001 12:59:34.005435 4727 generic.go:334] "Generic (PLEG): container finished" podID="d18290ae-64a5-44a5-a704-90977d85852b" containerID="cacd2a9209dd857fc1890a57e560a24e6efca70576638e54f6197ee82d5463f5" exitCode=0 Oct 01 12:59:34 crc kubenswrapper[4727]: I1001 12:59:34.005587 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerDied","Data":"cacd2a9209dd857fc1890a57e560a24e6efca70576638e54f6197ee82d5463f5"} Oct 01 12:59:34 crc kubenswrapper[4727]: I1001 12:59:34.005873 4727 scope.go:117] "RemoveContainer" containerID="d15726f80d85ac871118ff8508f8fbb90331c1d082df7e96a9adc970ffc70f86" Oct 01 12:59:35 crc kubenswrapper[4727]: I1001 12:59:35.014906 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6"} Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.155667 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz"] Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.157850 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.160666 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.160932 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.165975 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz"] Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.221404 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c2cce7-3476-45f2-9274-4783a1273160-config-volume\") pod \"collect-profiles-29322060-wvpsz\" (UID: \"75c2cce7-3476-45f2-9274-4783a1273160\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.221572 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxjsz\" (UniqueName: \"kubernetes.io/projected/75c2cce7-3476-45f2-9274-4783a1273160-kube-api-access-bxjsz\") pod \"collect-profiles-29322060-wvpsz\" (UID: \"75c2cce7-3476-45f2-9274-4783a1273160\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.221614 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c2cce7-3476-45f2-9274-4783a1273160-secret-volume\") pod \"collect-profiles-29322060-wvpsz\" (UID: \"75c2cce7-3476-45f2-9274-4783a1273160\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.323938 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c2cce7-3476-45f2-9274-4783a1273160-config-volume\") pod \"collect-profiles-29322060-wvpsz\" (UID: \"75c2cce7-3476-45f2-9274-4783a1273160\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.324527 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxjsz\" (UniqueName: \"kubernetes.io/projected/75c2cce7-3476-45f2-9274-4783a1273160-kube-api-access-bxjsz\") pod \"collect-profiles-29322060-wvpsz\" (UID: \"75c2cce7-3476-45f2-9274-4783a1273160\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.324709 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c2cce7-3476-45f2-9274-4783a1273160-secret-volume\") pod \"collect-profiles-29322060-wvpsz\" (UID: \"75c2cce7-3476-45f2-9274-4783a1273160\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.325061 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c2cce7-3476-45f2-9274-4783a1273160-config-volume\") pod 
\"collect-profiles-29322060-wvpsz\" (UID: \"75c2cce7-3476-45f2-9274-4783a1273160\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.338098 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c2cce7-3476-45f2-9274-4783a1273160-secret-volume\") pod \"collect-profiles-29322060-wvpsz\" (UID: \"75c2cce7-3476-45f2-9274-4783a1273160\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.341368 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxjsz\" (UniqueName: \"kubernetes.io/projected/75c2cce7-3476-45f2-9274-4783a1273160-kube-api-access-bxjsz\") pod \"collect-profiles-29322060-wvpsz\" (UID: \"75c2cce7-3476-45f2-9274-4783a1273160\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.487274 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:00 crc kubenswrapper[4727]: I1001 13:00:00.939546 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz"] Oct 01 13:00:01 crc kubenswrapper[4727]: I1001 13:00:01.280527 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" event={"ID":"75c2cce7-3476-45f2-9274-4783a1273160","Type":"ContainerStarted","Data":"5d3557c1180b8d5624ddef95ed1bf0850069c36c93e1d1409d0655aa3db78179"} Oct 01 13:00:01 crc kubenswrapper[4727]: I1001 13:00:01.281179 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" event={"ID":"75c2cce7-3476-45f2-9274-4783a1273160","Type":"ContainerStarted","Data":"3e43788ec1f8601a3b3659e63d328f279411a09ea8a59f11ac697b14a10ee49f"} Oct 01 13:00:01 crc kubenswrapper[4727]: I1001 13:00:01.307896 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" podStartSLOduration=1.307880164 podStartE2EDuration="1.307880164s" podCreationTimestamp="2025-10-01 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:00:01.303896918 +0000 UTC m=+1379.625251775" watchObservedRunningTime="2025-10-01 13:00:01.307880164 +0000 UTC m=+1379.629234991" Oct 01 13:00:02 crc kubenswrapper[4727]: I1001 13:00:02.298304 4727 generic.go:334] "Generic (PLEG): container finished" podID="75c2cce7-3476-45f2-9274-4783a1273160" containerID="5d3557c1180b8d5624ddef95ed1bf0850069c36c93e1d1409d0655aa3db78179" exitCode=0 Oct 01 13:00:02 crc kubenswrapper[4727]: I1001 13:00:02.298408 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" event={"ID":"75c2cce7-3476-45f2-9274-4783a1273160","Type":"ContainerDied","Data":"5d3557c1180b8d5624ddef95ed1bf0850069c36c93e1d1409d0655aa3db78179"} Oct 01 13:00:03 crc kubenswrapper[4727]: I1001 13:00:03.633288 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:03 crc kubenswrapper[4727]: I1001 13:00:03.698770 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c2cce7-3476-45f2-9274-4783a1273160-config-volume\") pod \"75c2cce7-3476-45f2-9274-4783a1273160\" (UID: \"75c2cce7-3476-45f2-9274-4783a1273160\") " Oct 01 13:00:03 crc kubenswrapper[4727]: I1001 13:00:03.699226 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c2cce7-3476-45f2-9274-4783a1273160-secret-volume\") pod \"75c2cce7-3476-45f2-9274-4783a1273160\" (UID: \"75c2cce7-3476-45f2-9274-4783a1273160\") " Oct 01 13:00:03 crc kubenswrapper[4727]: I1001 13:00:03.699251 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxjsz\" (UniqueName: \"kubernetes.io/projected/75c2cce7-3476-45f2-9274-4783a1273160-kube-api-access-bxjsz\") pod \"75c2cce7-3476-45f2-9274-4783a1273160\" (UID: \"75c2cce7-3476-45f2-9274-4783a1273160\") " Oct 01 13:00:03 crc kubenswrapper[4727]: I1001 13:00:03.699764 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75c2cce7-3476-45f2-9274-4783a1273160-config-volume" (OuterVolumeSpecName: "config-volume") pod "75c2cce7-3476-45f2-9274-4783a1273160" (UID: "75c2cce7-3476-45f2-9274-4783a1273160"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:00:03 crc kubenswrapper[4727]: I1001 13:00:03.705191 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c2cce7-3476-45f2-9274-4783a1273160-kube-api-access-bxjsz" (OuterVolumeSpecName: "kube-api-access-bxjsz") pod "75c2cce7-3476-45f2-9274-4783a1273160" (UID: "75c2cce7-3476-45f2-9274-4783a1273160"). InnerVolumeSpecName "kube-api-access-bxjsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:00:03 crc kubenswrapper[4727]: I1001 13:00:03.705394 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75c2cce7-3476-45f2-9274-4783a1273160-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "75c2cce7-3476-45f2-9274-4783a1273160" (UID: "75c2cce7-3476-45f2-9274-4783a1273160"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:00:03 crc kubenswrapper[4727]: I1001 13:00:03.802190 4727 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75c2cce7-3476-45f2-9274-4783a1273160-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:00:03 crc kubenswrapper[4727]: I1001 13:00:03.802235 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxjsz\" (UniqueName: \"kubernetes.io/projected/75c2cce7-3476-45f2-9274-4783a1273160-kube-api-access-bxjsz\") on node \"crc\" DevicePath \"\"" Oct 01 13:00:03 crc kubenswrapper[4727]: I1001 13:00:03.802249 4727 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75c2cce7-3476-45f2-9274-4783a1273160-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:00:04 crc kubenswrapper[4727]: I1001 13:00:04.327636 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" event={"ID":"75c2cce7-3476-45f2-9274-4783a1273160","Type":"ContainerDied","Data":"3e43788ec1f8601a3b3659e63d328f279411a09ea8a59f11ac697b14a10ee49f"} Oct 01 13:00:04 crc kubenswrapper[4727]: I1001 13:00:04.327682 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e43788ec1f8601a3b3659e63d328f279411a09ea8a59f11ac697b14a10ee49f" Oct 01 13:00:04 crc kubenswrapper[4727]: I1001 13:00:04.327708 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-wvpsz" Oct 01 13:00:09 crc kubenswrapper[4727]: I1001 13:00:09.235564 4727 scope.go:117] "RemoveContainer" containerID="759006ab7302535a7223715dd8e5b23bbc8d0cd4ded5a1c343658428088b08a8" Oct 01 13:00:09 crc kubenswrapper[4727]: I1001 13:00:09.260252 4727 scope.go:117] "RemoveContainer" containerID="510d5edffe1462f2a9bc5254c8aa0f392fde857c01445c509f6c935e4839da71" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.646472 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pvv5n"] Oct 01 13:00:33 crc kubenswrapper[4727]: E1001 13:00:33.656294 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c2cce7-3476-45f2-9274-4783a1273160" containerName="collect-profiles" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.656411 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c2cce7-3476-45f2-9274-4783a1273160" containerName="collect-profiles" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.656719 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c2cce7-3476-45f2-9274-4783a1273160" containerName="collect-profiles" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.658518 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.674852 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pvv5n"] Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.674906 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrf59\" (UniqueName: \"kubernetes.io/projected/0a8f26dd-f951-4817-92d7-1fab14bc4990-kube-api-access-vrf59\") pod \"certified-operators-pvv5n\" (UID: \"0a8f26dd-f951-4817-92d7-1fab14bc4990\") " pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.674954 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8f26dd-f951-4817-92d7-1fab14bc4990-utilities\") pod \"certified-operators-pvv5n\" (UID: \"0a8f26dd-f951-4817-92d7-1fab14bc4990\") " pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.675179 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8f26dd-f951-4817-92d7-1fab14bc4990-catalog-content\") pod \"certified-operators-pvv5n\" (UID: \"0a8f26dd-f951-4817-92d7-1fab14bc4990\") " pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.776466 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8f26dd-f951-4817-92d7-1fab14bc4990-catalog-content\") pod \"certified-operators-pvv5n\" (UID: \"0a8f26dd-f951-4817-92d7-1fab14bc4990\") " pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.776603 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrf59\" (UniqueName: \"kubernetes.io/projected/0a8f26dd-f951-4817-92d7-1fab14bc4990-kube-api-access-vrf59\") pod \"certified-operators-pvv5n\" (UID: \"0a8f26dd-f951-4817-92d7-1fab14bc4990\") " pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.776631 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8f26dd-f951-4817-92d7-1fab14bc4990-utilities\") pod \"certified-operators-pvv5n\" (UID: \"0a8f26dd-f951-4817-92d7-1fab14bc4990\") " pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.776968 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8f26dd-f951-4817-92d7-1fab14bc4990-catalog-content\") pod \"certified-operators-pvv5n\" (UID: \"0a8f26dd-f951-4817-92d7-1fab14bc4990\") " pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.777035 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8f26dd-f951-4817-92d7-1fab14bc4990-utilities\") pod \"certified-operators-pvv5n\" (UID: \"0a8f26dd-f951-4817-92d7-1fab14bc4990\") " pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.798403 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vrf59\" (UniqueName: \"kubernetes.io/projected/0a8f26dd-f951-4817-92d7-1fab14bc4990-kube-api-access-vrf59\") pod \"certified-operators-pvv5n\" (UID: \"0a8f26dd-f951-4817-92d7-1fab14bc4990\") " pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:33 crc kubenswrapper[4727]: I1001 13:00:33.990697 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:34 crc kubenswrapper[4727]: I1001 13:00:34.480114 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pvv5n"] Oct 01 13:00:34 crc kubenswrapper[4727]: W1001 13:00:34.484738 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a8f26dd_f951_4817_92d7_1fab14bc4990.slice/crio-b35c38c26dd992163c50c2c3c6e57f4fac1d37e61c967bfc40326cf4ab6cee99 WatchSource:0}: Error finding container b35c38c26dd992163c50c2c3c6e57f4fac1d37e61c967bfc40326cf4ab6cee99: Status 404 returned error can't find the container with id b35c38c26dd992163c50c2c3c6e57f4fac1d37e61c967bfc40326cf4ab6cee99 Oct 01 13:00:34 crc kubenswrapper[4727]: I1001 13:00:34.620552 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pvv5n" event={"ID":"0a8f26dd-f951-4817-92d7-1fab14bc4990","Type":"ContainerStarted","Data":"b35c38c26dd992163c50c2c3c6e57f4fac1d37e61c967bfc40326cf4ab6cee99"} Oct 01 13:00:35 crc kubenswrapper[4727]: I1001 13:00:35.632449 4727 generic.go:334] "Generic (PLEG): container finished" podID="0a8f26dd-f951-4817-92d7-1fab14bc4990" containerID="4c6aca7adee8369b3fde60efb2a8c0d67838e4fe2118efeae33a209e23d5d79f" exitCode=0 Oct 01 13:00:35 crc kubenswrapper[4727]: I1001 13:00:35.632508 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pvv5n" event={"ID":"0a8f26dd-f951-4817-92d7-1fab14bc4990","Type":"ContainerDied","Data":"4c6aca7adee8369b3fde60efb2a8c0d67838e4fe2118efeae33a209e23d5d79f"} Oct 01 13:00:37 crc kubenswrapper[4727]: I1001 13:00:37.663527 4727 generic.go:334] "Generic (PLEG): container finished" podID="0a8f26dd-f951-4817-92d7-1fab14bc4990" containerID="48c83e2b947642ed07e5b85cfa617320a7543387f7121b354b9d034308815d2d" exitCode=0 Oct 01 13:00:37 crc kubenswrapper[4727]: I1001 13:00:37.663587 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pvv5n" event={"ID":"0a8f26dd-f951-4817-92d7-1fab14bc4990","Type":"ContainerDied","Data":"48c83e2b947642ed07e5b85cfa617320a7543387f7121b354b9d034308815d2d"} Oct 01 13:00:38 crc kubenswrapper[4727]: I1001 13:00:38.679496 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pvv5n" event={"ID":"0a8f26dd-f951-4817-92d7-1fab14bc4990","Type":"ContainerStarted","Data":"a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c"} Oct 01 13:00:38 crc kubenswrapper[4727]: I1001 13:00:38.708510 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pvv5n" podStartSLOduration=3.152915095 podStartE2EDuration="5.70849085s" podCreationTimestamp="2025-10-01 13:00:33 +0000 UTC" firstStartedPulling="2025-10-01 13:00:35.634393285 +0000 UTC m=+1413.955748122" lastFinishedPulling="2025-10-01 13:00:38.18996904 +0000 UTC m=+1416.511323877" observedRunningTime="2025-10-01 13:00:38.701240582 +0000 UTC 
m=+1417.022595439" watchObservedRunningTime="2025-10-01 13:00:38.70849085 +0000 UTC m=+1417.029845687" Oct 01 13:00:43 crc kubenswrapper[4727]: I1001 13:00:43.991528 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:43 crc kubenswrapper[4727]: I1001 13:00:43.992185 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:44 crc kubenswrapper[4727]: I1001 13:00:44.041337 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:44 crc kubenswrapper[4727]: I1001 13:00:44.782967 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:44 crc kubenswrapper[4727]: I1001 13:00:44.880839 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pvv5n"] Oct 01 13:00:46 crc kubenswrapper[4727]: I1001 13:00:46.750392 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pvv5n" podUID="0a8f26dd-f951-4817-92d7-1fab14bc4990" containerName="registry-server" containerID="cri-o://a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c" gracePeriod=2 Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.284967 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.356669 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8f26dd-f951-4817-92d7-1fab14bc4990-utilities\") pod \"0a8f26dd-f951-4817-92d7-1fab14bc4990\" (UID: \"0a8f26dd-f951-4817-92d7-1fab14bc4990\") " Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.356928 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8f26dd-f951-4817-92d7-1fab14bc4990-catalog-content\") pod \"0a8f26dd-f951-4817-92d7-1fab14bc4990\" (UID: \"0a8f26dd-f951-4817-92d7-1fab14bc4990\") " Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.356962 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrf59\" (UniqueName: \"kubernetes.io/projected/0a8f26dd-f951-4817-92d7-1fab14bc4990-kube-api-access-vrf59\") pod \"0a8f26dd-f951-4817-92d7-1fab14bc4990\" (UID: \"0a8f26dd-f951-4817-92d7-1fab14bc4990\") " Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.358887 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a8f26dd-f951-4817-92d7-1fab14bc4990-utilities" (OuterVolumeSpecName: "utilities") pod "0a8f26dd-f951-4817-92d7-1fab14bc4990" (UID: "0a8f26dd-f951-4817-92d7-1fab14bc4990"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.365756 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8f26dd-f951-4817-92d7-1fab14bc4990-kube-api-access-vrf59" (OuterVolumeSpecName: "kube-api-access-vrf59") pod "0a8f26dd-f951-4817-92d7-1fab14bc4990" (UID: "0a8f26dd-f951-4817-92d7-1fab14bc4990"). InnerVolumeSpecName "kube-api-access-vrf59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.405460 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a8f26dd-f951-4817-92d7-1fab14bc4990-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a8f26dd-f951-4817-92d7-1fab14bc4990" (UID: "0a8f26dd-f951-4817-92d7-1fab14bc4990"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.459093 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8f26dd-f951-4817-92d7-1fab14bc4990-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.459367 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8f26dd-f951-4817-92d7-1fab14bc4990-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.459459 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrf59\" (UniqueName: \"kubernetes.io/projected/0a8f26dd-f951-4817-92d7-1fab14bc4990-kube-api-access-vrf59\") on node \"crc\" DevicePath \"\"" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.759823 4727 generic.go:334] "Generic (PLEG): container finished" podID="0a8f26dd-f951-4817-92d7-1fab14bc4990" containerID="a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c" exitCode=0 Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.759872 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pvv5n" event={"ID":"0a8f26dd-f951-4817-92d7-1fab14bc4990","Type":"ContainerDied","Data":"a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c"} Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.760229 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pvv5n" event={"ID":"0a8f26dd-f951-4817-92d7-1fab14bc4990","Type":"ContainerDied","Data":"b35c38c26dd992163c50c2c3c6e57f4fac1d37e61c967bfc40326cf4ab6cee99"} Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.760263 4727 scope.go:117] "RemoveContainer" containerID="a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.759942 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pvv5n" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.789547 4727 scope.go:117] "RemoveContainer" containerID="48c83e2b947642ed07e5b85cfa617320a7543387f7121b354b9d034308815d2d" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.797023 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pvv5n"] Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.805238 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pvv5n"] Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.817129 4727 scope.go:117] "RemoveContainer" containerID="4c6aca7adee8369b3fde60efb2a8c0d67838e4fe2118efeae33a209e23d5d79f" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.860692 4727 scope.go:117] "RemoveContainer" containerID="a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c" Oct 01 13:00:47 crc kubenswrapper[4727]: E1001 13:00:47.861157 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c\": container with ID starting with a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c not found: ID does not exist" containerID="a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.861198 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c"} err="failed to get container status \"a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c\": rpc error: code = NotFound desc = could not find container \"a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c\": container with ID starting with a517217252257e6706e643e02f417f165e6289fb2c170376b86ef7bee3ad5a7c not found: ID does not exist" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.861358 4727 scope.go:117] "RemoveContainer" containerID="48c83e2b947642ed07e5b85cfa617320a7543387f7121b354b9d034308815d2d" Oct 01 13:00:47 crc kubenswrapper[4727]: E1001 13:00:47.861788 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48c83e2b947642ed07e5b85cfa617320a7543387f7121b354b9d034308815d2d\": container with ID starting with 48c83e2b947642ed07e5b85cfa617320a7543387f7121b354b9d034308815d2d not found: ID does not exist" containerID="48c83e2b947642ed07e5b85cfa617320a7543387f7121b354b9d034308815d2d" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.861819 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48c83e2b947642ed07e5b85cfa617320a7543387f7121b354b9d034308815d2d"} err="failed to get container status \"48c83e2b947642ed07e5b85cfa617320a7543387f7121b354b9d034308815d2d\": rpc error: code = NotFound desc = could not find container \"48c83e2b947642ed07e5b85cfa617320a7543387f7121b354b9d034308815d2d\": container with ID starting with 48c83e2b947642ed07e5b85cfa617320a7543387f7121b354b9d034308815d2d not found: ID does not exist" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.861837 4727 scope.go:117] "RemoveContainer" containerID="4c6aca7adee8369b3fde60efb2a8c0d67838e4fe2118efeae33a209e23d5d79f" Oct 01 13:00:47 crc kubenswrapper[4727]: E1001 13:00:47.862273 4727 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4c6aca7adee8369b3fde60efb2a8c0d67838e4fe2118efeae33a209e23d5d79f\": container with ID starting with 4c6aca7adee8369b3fde60efb2a8c0d67838e4fe2118efeae33a209e23d5d79f not found: ID does not exist" containerID="4c6aca7adee8369b3fde60efb2a8c0d67838e4fe2118efeae33a209e23d5d79f" Oct 01 13:00:47 crc kubenswrapper[4727]: I1001 13:00:47.862298 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c6aca7adee8369b3fde60efb2a8c0d67838e4fe2118efeae33a209e23d5d79f"} err="failed to get container status \"4c6aca7adee8369b3fde60efb2a8c0d67838e4fe2118efeae33a209e23d5d79f\": rpc error: code = NotFound desc = could not find container \"4c6aca7adee8369b3fde60efb2a8c0d67838e4fe2118efeae33a209e23d5d79f\": container with ID starting with 4c6aca7adee8369b3fde60efb2a8c0d67838e4fe2118efeae33a209e23d5d79f not found: ID does not exist" Oct 01 13:00:48 crc kubenswrapper[4727]: I1001 13:00:48.388109 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8f26dd-f951-4817-92d7-1fab14bc4990" path="/var/lib/kubelet/pods/0a8f26dd-f951-4817-92d7-1fab14bc4990/volumes" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.147030 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6hqwq"] Oct 01 13:00:59 crc kubenswrapper[4727]: E1001 13:00:59.149609 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8f26dd-f951-4817-92d7-1fab14bc4990" containerName="registry-server" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.149917 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8f26dd-f951-4817-92d7-1fab14bc4990" containerName="registry-server" Oct 01 13:00:59 crc kubenswrapper[4727]: E1001 13:00:59.151465 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8f26dd-f951-4817-92d7-1fab14bc4990" containerName="extract-utilities" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.151614 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8f26dd-f951-4817-92d7-1fab14bc4990" containerName="extract-utilities" Oct 01 13:00:59 crc kubenswrapper[4727]: E1001 13:00:59.151779 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8f26dd-f951-4817-92d7-1fab14bc4990" containerName="extract-content" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.151862 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8f26dd-f951-4817-92d7-1fab14bc4990" containerName="extract-content" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.152304 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8f26dd-f951-4817-92d7-1fab14bc4990" containerName="registry-server" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.154082 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.165798 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6hqwq"] Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.206677 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2c105e-c653-4055-ab64-4c8847d8df08-utilities\") pod \"redhat-marketplace-6hqwq\" (UID: \"5a2c105e-c653-4055-ab64-4c8847d8df08\") " pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.207220 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2c105e-c653-4055-ab64-4c8847d8df08-catalog-content\") pod \"redhat-marketplace-6hqwq\" (UID: \"5a2c105e-c653-4055-ab64-4c8847d8df08\") " pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.207411 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ss48\" (UniqueName: \"kubernetes.io/projected/5a2c105e-c653-4055-ab64-4c8847d8df08-kube-api-access-8ss48\") pod \"redhat-marketplace-6hqwq\" (UID: \"5a2c105e-c653-4055-ab64-4c8847d8df08\") " pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.308775 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ss48\" (UniqueName: \"kubernetes.io/projected/5a2c105e-c653-4055-ab64-4c8847d8df08-kube-api-access-8ss48\") pod \"redhat-marketplace-6hqwq\" (UID: \"5a2c105e-c653-4055-ab64-4c8847d8df08\") " pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.309171 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2c105e-c653-4055-ab64-4c8847d8df08-utilities\") pod \"redhat-marketplace-6hqwq\" (UID: \"5a2c105e-c653-4055-ab64-4c8847d8df08\") " pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.309363 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2c105e-c653-4055-ab64-4c8847d8df08-catalog-content\") pod \"redhat-marketplace-6hqwq\" (UID: \"5a2c105e-c653-4055-ab64-4c8847d8df08\") " pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.309821 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2c105e-c653-4055-ab64-4c8847d8df08-utilities\") pod \"redhat-marketplace-6hqwq\" (UID: \"5a2c105e-c653-4055-ab64-4c8847d8df08\") " pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.309833 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2c105e-c653-4055-ab64-4c8847d8df08-catalog-content\") pod \"redhat-marketplace-6hqwq\" (UID: \"5a2c105e-c653-4055-ab64-4c8847d8df08\") " pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.329471 4727 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8ss48\" (UniqueName: \"kubernetes.io/projected/5a2c105e-c653-4055-ab64-4c8847d8df08-kube-api-access-8ss48\") pod \"redhat-marketplace-6hqwq\" (UID: \"5a2c105e-c653-4055-ab64-4c8847d8df08\") " pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.482388 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:00:59 crc kubenswrapper[4727]: I1001 13:00:59.989710 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6hqwq"] Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.158619 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29322061-hhrsh"] Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.161765 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.174597 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322061-hhrsh"] Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.229336 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-fernet-keys\") pod \"keystone-cron-29322061-hhrsh\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.229401 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbzwd\" (UniqueName: \"kubernetes.io/projected/afedada7-a84e-4fdc-94f1-feb3b93398d1-kube-api-access-kbzwd\") pod \"keystone-cron-29322061-hhrsh\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.229447 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-config-data\") pod \"keystone-cron-29322061-hhrsh\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.229473 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-combined-ca-bundle\") pod \"keystone-cron-29322061-hhrsh\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.331226 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-fernet-keys\") pod \"keystone-cron-29322061-hhrsh\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.331829 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbzwd\" (UniqueName: \"kubernetes.io/projected/afedada7-a84e-4fdc-94f1-feb3b93398d1-kube-api-access-kbzwd\") pod \"keystone-cron-29322061-hhrsh\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " 
pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.332091 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-config-data\") pod \"keystone-cron-29322061-hhrsh\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.332239 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-combined-ca-bundle\") pod \"keystone-cron-29322061-hhrsh\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.340825 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-combined-ca-bundle\") pod \"keystone-cron-29322061-hhrsh\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.340840 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-config-data\") pod \"keystone-cron-29322061-hhrsh\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.340969 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-fernet-keys\") pod \"keystone-cron-29322061-hhrsh\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.359372 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbzwd\" (UniqueName: \"kubernetes.io/projected/afedada7-a84e-4fdc-94f1-feb3b93398d1-kube-api-access-kbzwd\") pod \"keystone-cron-29322061-hhrsh\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.493127 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.904270 4727 generic.go:334] "Generic (PLEG): container finished" podID="5a2c105e-c653-4055-ab64-4c8847d8df08" containerID="94aa085d26b868d82b74ddad596719d8ced99bb82b963573f30119c1a56cb002" exitCode=0 Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.904597 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hqwq" event={"ID":"5a2c105e-c653-4055-ab64-4c8847d8df08","Type":"ContainerDied","Data":"94aa085d26b868d82b74ddad596719d8ced99bb82b963573f30119c1a56cb002"} Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.904626 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hqwq" event={"ID":"5a2c105e-c653-4055-ab64-4c8847d8df08","Type":"ContainerStarted","Data":"0bed27b3e32a4ac8f37ad002cba2681c820d45b257a71614b4ae2aa6868d1fc0"} Oct 01 13:01:00 crc kubenswrapper[4727]: W1001 13:01:00.961500 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafedada7_a84e_4fdc_94f1_feb3b93398d1.slice/crio-1929da01891c396e862a83da2e9a4080b5b29a51ba73b2e725579ad99b698feb WatchSource:0}: Error finding container 1929da01891c396e862a83da2e9a4080b5b29a51ba73b2e725579ad99b698feb: Status 404 returned error can't find the container with id 1929da01891c396e862a83da2e9a4080b5b29a51ba73b2e725579ad99b698feb Oct 01 13:01:00 crc kubenswrapper[4727]: I1001 13:01:00.966117 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322061-hhrsh"] Oct 01 13:01:01 crc kubenswrapper[4727]: I1001 13:01:01.916200 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hqwq" event={"ID":"5a2c105e-c653-4055-ab64-4c8847d8df08","Type":"ContainerStarted","Data":"4d0dd7dfdd2d15f6fd4f4211f0bee4d6fc82530e9d6e2a98a71171869fbe975f"} Oct 01 13:01:01 crc kubenswrapper[4727]: I1001 13:01:01.918865 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322061-hhrsh" event={"ID":"afedada7-a84e-4fdc-94f1-feb3b93398d1","Type":"ContainerStarted","Data":"0182619d344e8f30d7c8fcfebe13aee04746d0dcc4e01d8385e52e025dec8ac9"} Oct 01 13:01:01 crc kubenswrapper[4727]: I1001 13:01:01.918946 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322061-hhrsh" event={"ID":"afedada7-a84e-4fdc-94f1-feb3b93398d1","Type":"ContainerStarted","Data":"1929da01891c396e862a83da2e9a4080b5b29a51ba73b2e725579ad99b698feb"} Oct 01 13:01:01 crc kubenswrapper[4727]: I1001 13:01:01.957874 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29322061-hhrsh" podStartSLOduration=1.957852213 podStartE2EDuration="1.957852213s" podCreationTimestamp="2025-10-01 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:01:01.949605063 +0000 UTC m=+1440.270959910" watchObservedRunningTime="2025-10-01 13:01:01.957852213 +0000 UTC m=+1440.279207070" Oct 01 13:01:02 crc kubenswrapper[4727]: I1001 13:01:02.942689 4727 generic.go:334] "Generic (PLEG): container finished" podID="5a2c105e-c653-4055-ab64-4c8847d8df08" containerID="4d0dd7dfdd2d15f6fd4f4211f0bee4d6fc82530e9d6e2a98a71171869fbe975f" exitCode=0 Oct 01 13:01:02 crc kubenswrapper[4727]: I1001 13:01:02.944931 4727 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hqwq" event={"ID":"5a2c105e-c653-4055-ab64-4c8847d8df08","Type":"ContainerDied","Data":"4d0dd7dfdd2d15f6fd4f4211f0bee4d6fc82530e9d6e2a98a71171869fbe975f"} Oct 01 13:01:03 crc kubenswrapper[4727]: I1001 13:01:03.953669 4727 generic.go:334] "Generic (PLEG): container finished" podID="afedada7-a84e-4fdc-94f1-feb3b93398d1" containerID="0182619d344e8f30d7c8fcfebe13aee04746d0dcc4e01d8385e52e025dec8ac9" exitCode=0 Oct 01 13:01:03 crc kubenswrapper[4727]: I1001 13:01:03.953748 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322061-hhrsh" event={"ID":"afedada7-a84e-4fdc-94f1-feb3b93398d1","Type":"ContainerDied","Data":"0182619d344e8f30d7c8fcfebe13aee04746d0dcc4e01d8385e52e025dec8ac9"} Oct 01 13:01:03 crc kubenswrapper[4727]: I1001 13:01:03.956880 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hqwq" event={"ID":"5a2c105e-c653-4055-ab64-4c8847d8df08","Type":"ContainerStarted","Data":"7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3"} Oct 01 13:01:03 crc kubenswrapper[4727]: I1001 13:01:03.999775 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6hqwq" podStartSLOduration=2.518900958 podStartE2EDuration="4.999727648s" podCreationTimestamp="2025-10-01 13:00:59 +0000 UTC" firstStartedPulling="2025-10-01 13:01:00.908520016 +0000 UTC m=+1439.229874853" lastFinishedPulling="2025-10-01 13:01:03.389346686 +0000 UTC m=+1441.710701543" observedRunningTime="2025-10-01 13:01:03.996519458 +0000 UTC m=+1442.317874295" watchObservedRunningTime="2025-10-01 13:01:03.999727648 +0000 UTC m=+1442.321082485" Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.300987 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.342527 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbzwd\" (UniqueName: \"kubernetes.io/projected/afedada7-a84e-4fdc-94f1-feb3b93398d1-kube-api-access-kbzwd\") pod \"afedada7-a84e-4fdc-94f1-feb3b93398d1\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.342675 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-combined-ca-bundle\") pod \"afedada7-a84e-4fdc-94f1-feb3b93398d1\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.342733 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-config-data\") pod \"afedada7-a84e-4fdc-94f1-feb3b93398d1\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.342833 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-fernet-keys\") pod \"afedada7-a84e-4fdc-94f1-feb3b93398d1\" (UID: \"afedada7-a84e-4fdc-94f1-feb3b93398d1\") " Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.349098 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afedada7-a84e-4fdc-94f1-feb3b93398d1-kube-api-access-kbzwd" (OuterVolumeSpecName: "kube-api-access-kbzwd") pod "afedada7-a84e-4fdc-94f1-feb3b93398d1" (UID: "afedada7-a84e-4fdc-94f1-feb3b93398d1"). InnerVolumeSpecName "kube-api-access-kbzwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.349368 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "afedada7-a84e-4fdc-94f1-feb3b93398d1" (UID: "afedada7-a84e-4fdc-94f1-feb3b93398d1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.373197 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afedada7-a84e-4fdc-94f1-feb3b93398d1" (UID: "afedada7-a84e-4fdc-94f1-feb3b93398d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.396223 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-config-data" (OuterVolumeSpecName: "config-data") pod "afedada7-a84e-4fdc-94f1-feb3b93398d1" (UID: "afedada7-a84e-4fdc-94f1-feb3b93398d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.445041 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbzwd\" (UniqueName: \"kubernetes.io/projected/afedada7-a84e-4fdc-94f1-feb3b93398d1-kube-api-access-kbzwd\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.445222 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.445280 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.445332 4727 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/afedada7-a84e-4fdc-94f1-feb3b93398d1-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.977758 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322061-hhrsh" event={"ID":"afedada7-a84e-4fdc-94f1-feb3b93398d1","Type":"ContainerDied","Data":"1929da01891c396e862a83da2e9a4080b5b29a51ba73b2e725579ad99b698feb"} Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.978231 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1929da01891c396e862a83da2e9a4080b5b29a51ba73b2e725579ad99b698feb" Oct 01 13:01:05 crc kubenswrapper[4727]: I1001 13:01:05.977806 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322061-hhrsh" Oct 01 13:01:09 crc kubenswrapper[4727]: I1001 13:01:09.319441 4727 scope.go:117] "RemoveContainer" containerID="6a35062585bd1e7d212a0a0d46d459c5729a57996ba20acbb7c35cd6a5423340" Oct 01 13:01:09 crc kubenswrapper[4727]: I1001 13:01:09.355054 4727 scope.go:117] "RemoveContainer" containerID="288e4e144ffc39623154a66d7ef4537301d04c325ba1ea2092a67705821d4216" Oct 01 13:01:09 crc kubenswrapper[4727]: I1001 13:01:09.384453 4727 scope.go:117] "RemoveContainer" containerID="d5037bf16e4c31dd20f34fdf8ab22a65417119201f8e2b42b7162e9ef7f38435" Oct 01 13:01:09 crc kubenswrapper[4727]: I1001 13:01:09.411864 4727 scope.go:117] "RemoveContainer" containerID="bfcbba1c5358c39debd98ccd9449ffc0bafb5e46618c2d78f8dac485f86c7e9b" Oct 01 13:01:09 crc kubenswrapper[4727]: I1001 13:01:09.483340 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:01:09 crc kubenswrapper[4727]: I1001 13:01:09.483417 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:01:09 crc kubenswrapper[4727]: I1001 13:01:09.538056 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:01:10 crc kubenswrapper[4727]: I1001 13:01:10.081695 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:01:10 crc kubenswrapper[4727]: I1001 13:01:10.139957 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6hqwq"] Oct 01 13:01:12 crc kubenswrapper[4727]: I1001 13:01:12.046880 
4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6hqwq" podUID="5a2c105e-c653-4055-ab64-4c8847d8df08" containerName="registry-server" containerID="cri-o://7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3" gracePeriod=2 Oct 01 13:01:12 crc kubenswrapper[4727]: I1001 13:01:12.543638 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:01:12 crc kubenswrapper[4727]: I1001 13:01:12.594221 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2c105e-c653-4055-ab64-4c8847d8df08-catalog-content\") pod \"5a2c105e-c653-4055-ab64-4c8847d8df08\" (UID: \"5a2c105e-c653-4055-ab64-4c8847d8df08\") " Oct 01 13:01:12 crc kubenswrapper[4727]: I1001 13:01:12.594357 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ss48\" (UniqueName: \"kubernetes.io/projected/5a2c105e-c653-4055-ab64-4c8847d8df08-kube-api-access-8ss48\") pod \"5a2c105e-c653-4055-ab64-4c8847d8df08\" (UID: \"5a2c105e-c653-4055-ab64-4c8847d8df08\") " Oct 01 13:01:12 crc kubenswrapper[4727]: I1001 13:01:12.594514 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2c105e-c653-4055-ab64-4c8847d8df08-utilities\") pod \"5a2c105e-c653-4055-ab64-4c8847d8df08\" (UID: \"5a2c105e-c653-4055-ab64-4c8847d8df08\") " Oct 01 13:01:12 crc kubenswrapper[4727]: I1001 13:01:12.595406 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a2c105e-c653-4055-ab64-4c8847d8df08-utilities" (OuterVolumeSpecName: "utilities") pod "5a2c105e-c653-4055-ab64-4c8847d8df08" (UID: "5a2c105e-c653-4055-ab64-4c8847d8df08"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:01:12 crc kubenswrapper[4727]: I1001 13:01:12.605460 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a2c105e-c653-4055-ab64-4c8847d8df08-kube-api-access-8ss48" (OuterVolumeSpecName: "kube-api-access-8ss48") pod "5a2c105e-c653-4055-ab64-4c8847d8df08" (UID: "5a2c105e-c653-4055-ab64-4c8847d8df08"). InnerVolumeSpecName "kube-api-access-8ss48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:01:12 crc kubenswrapper[4727]: I1001 13:01:12.608636 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a2c105e-c653-4055-ab64-4c8847d8df08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a2c105e-c653-4055-ab64-4c8847d8df08" (UID: "5a2c105e-c653-4055-ab64-4c8847d8df08"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:01:12 crc kubenswrapper[4727]: I1001 13:01:12.697532 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2c105e-c653-4055-ab64-4c8847d8df08-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:12 crc kubenswrapper[4727]: I1001 13:01:12.697606 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ss48\" (UniqueName: \"kubernetes.io/projected/5a2c105e-c653-4055-ab64-4c8847d8df08-kube-api-access-8ss48\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:12 crc kubenswrapper[4727]: I1001 13:01:12.697621 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2c105e-c653-4055-ab64-4c8847d8df08-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.057485 4727 generic.go:334] "Generic (PLEG): container finished" podID="5a2c105e-c653-4055-ab64-4c8847d8df08" containerID="7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3" exitCode=0 Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.057537 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hqwq" event={"ID":"5a2c105e-c653-4055-ab64-4c8847d8df08","Type":"ContainerDied","Data":"7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3"} Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.057590 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hqwq" event={"ID":"5a2c105e-c653-4055-ab64-4c8847d8df08","Type":"ContainerDied","Data":"0bed27b3e32a4ac8f37ad002cba2681c820d45b257a71614b4ae2aa6868d1fc0"} Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.057594 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6hqwq" Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.057611 4727 scope.go:117] "RemoveContainer" containerID="7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3" Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.089067 4727 scope.go:117] "RemoveContainer" containerID="4d0dd7dfdd2d15f6fd4f4211f0bee4d6fc82530e9d6e2a98a71171869fbe975f" Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.099536 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6hqwq"] Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.109886 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6hqwq"] Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.118392 4727 scope.go:117] "RemoveContainer" containerID="94aa085d26b868d82b74ddad596719d8ced99bb82b963573f30119c1a56cb002" Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.179300 4727 scope.go:117] "RemoveContainer" containerID="7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3" Oct 01 13:01:13 crc kubenswrapper[4727]: E1001 13:01:13.179713 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3\": container with ID starting with 7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3 not found: ID does not exist" containerID="7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3" Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.179766 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3"} err="failed to get container status \"7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3\": rpc error: code = NotFound desc = could not find container \"7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3\": container with ID starting with 7be8b97e00f73bab188aea795da0e3b2ad9a4f41adb7341f22cf3da72e508bf3 not found: ID does not exist" Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.179796 4727 scope.go:117] "RemoveContainer" containerID="4d0dd7dfdd2d15f6fd4f4211f0bee4d6fc82530e9d6e2a98a71171869fbe975f" Oct 01 13:01:13 crc kubenswrapper[4727]: E1001 13:01:13.180135 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0dd7dfdd2d15f6fd4f4211f0bee4d6fc82530e9d6e2a98a71171869fbe975f\": container with ID starting with 4d0dd7dfdd2d15f6fd4f4211f0bee4d6fc82530e9d6e2a98a71171869fbe975f not found: ID does not exist" containerID="4d0dd7dfdd2d15f6fd4f4211f0bee4d6fc82530e9d6e2a98a71171869fbe975f" Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.180170 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0dd7dfdd2d15f6fd4f4211f0bee4d6fc82530e9d6e2a98a71171869fbe975f"} err="failed to get container status \"4d0dd7dfdd2d15f6fd4f4211f0bee4d6fc82530e9d6e2a98a71171869fbe975f\": rpc error: code = NotFound desc = could not find container \"4d0dd7dfdd2d15f6fd4f4211f0bee4d6fc82530e9d6e2a98a71171869fbe975f\": container with ID starting with 4d0dd7dfdd2d15f6fd4f4211f0bee4d6fc82530e9d6e2a98a71171869fbe975f not found: ID does not exist" Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.180188 4727 scope.go:117] "RemoveContainer" 
containerID="94aa085d26b868d82b74ddad596719d8ced99bb82b963573f30119c1a56cb002" Oct 01 13:01:13 crc kubenswrapper[4727]: E1001 13:01:13.180560 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94aa085d26b868d82b74ddad596719d8ced99bb82b963573f30119c1a56cb002\": container with ID starting with 94aa085d26b868d82b74ddad596719d8ced99bb82b963573f30119c1a56cb002 not found: ID does not exist" containerID="94aa085d26b868d82b74ddad596719d8ced99bb82b963573f30119c1a56cb002" Oct 01 13:01:13 crc kubenswrapper[4727]: I1001 13:01:13.180588 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94aa085d26b868d82b74ddad596719d8ced99bb82b963573f30119c1a56cb002"} err="failed to get container status \"94aa085d26b868d82b74ddad596719d8ced99bb82b963573f30119c1a56cb002\": rpc error: code = NotFound desc = could not find container \"94aa085d26b868d82b74ddad596719d8ced99bb82b963573f30119c1a56cb002\": container with ID starting with 94aa085d26b868d82b74ddad596719d8ced99bb82b963573f30119c1a56cb002 not found: ID does not exist" Oct 01 13:01:14 crc kubenswrapper[4727]: I1001 13:01:14.384350 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a2c105e-c653-4055-ab64-4c8847d8df08" path="/var/lib/kubelet/pods/5a2c105e-c653-4055-ab64-4c8847d8df08/volumes" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.227470 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k7cvr"] Oct 01 13:01:40 crc kubenswrapper[4727]: E1001 13:01:40.230443 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2c105e-c653-4055-ab64-4c8847d8df08" containerName="registry-server" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.230562 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2c105e-c653-4055-ab64-4c8847d8df08" containerName="registry-server" Oct 01 13:01:40 crc kubenswrapper[4727]: E1001 13:01:40.230636 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2c105e-c653-4055-ab64-4c8847d8df08" containerName="extract-content" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.230693 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2c105e-c653-4055-ab64-4c8847d8df08" containerName="extract-content" Oct 01 13:01:40 crc kubenswrapper[4727]: E1001 13:01:40.230763 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2c105e-c653-4055-ab64-4c8847d8df08" containerName="extract-utilities" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.230829 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2c105e-c653-4055-ab64-4c8847d8df08" containerName="extract-utilities" Oct 01 13:01:40 crc kubenswrapper[4727]: E1001 13:01:40.230893 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afedada7-a84e-4fdc-94f1-feb3b93398d1" containerName="keystone-cron" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.230945 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="afedada7-a84e-4fdc-94f1-feb3b93398d1" containerName="keystone-cron" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.231234 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a2c105e-c653-4055-ab64-4c8847d8df08" containerName="registry-server" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.231311 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="afedada7-a84e-4fdc-94f1-feb3b93398d1" containerName="keystone-cron" Oct 01 
13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.233288 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.246659 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k7cvr"] Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.329749 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d99ed4-f0f6-4597-a224-941f817df121-catalog-content\") pod \"redhat-operators-k7cvr\" (UID: \"44d99ed4-f0f6-4597-a224-941f817df121\") " pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.330130 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbwzk\" (UniqueName: \"kubernetes.io/projected/44d99ed4-f0f6-4597-a224-941f817df121-kube-api-access-jbwzk\") pod \"redhat-operators-k7cvr\" (UID: \"44d99ed4-f0f6-4597-a224-941f817df121\") " pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.330303 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d99ed4-f0f6-4597-a224-941f817df121-utilities\") pod \"redhat-operators-k7cvr\" (UID: \"44d99ed4-f0f6-4597-a224-941f817df121\") " pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.432333 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d99ed4-f0f6-4597-a224-941f817df121-catalog-content\") pod \"redhat-operators-k7cvr\" (UID: \"44d99ed4-f0f6-4597-a224-941f817df121\") " pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.432703 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbwzk\" (UniqueName: \"kubernetes.io/projected/44d99ed4-f0f6-4597-a224-941f817df121-kube-api-access-jbwzk\") pod \"redhat-operators-k7cvr\" (UID: \"44d99ed4-f0f6-4597-a224-941f817df121\") " pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.432862 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d99ed4-f0f6-4597-a224-941f817df121-utilities\") pod \"redhat-operators-k7cvr\" (UID: \"44d99ed4-f0f6-4597-a224-941f817df121\") " pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.432891 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44d99ed4-f0f6-4597-a224-941f817df121-catalog-content\") pod \"redhat-operators-k7cvr\" (UID: \"44d99ed4-f0f6-4597-a224-941f817df121\") " pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.433327 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44d99ed4-f0f6-4597-a224-941f817df121-utilities\") pod \"redhat-operators-k7cvr\" (UID: \"44d99ed4-f0f6-4597-a224-941f817df121\") " pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:01:40 crc kubenswrapper[4727]: 
I1001 13:01:40.465037 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbwzk\" (UniqueName: \"kubernetes.io/projected/44d99ed4-f0f6-4597-a224-941f817df121-kube-api-access-jbwzk\") pod \"redhat-operators-k7cvr\" (UID: \"44d99ed4-f0f6-4597-a224-941f817df121\") " pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:01:40 crc kubenswrapper[4727]: I1001 13:01:40.557520 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:01:41 crc kubenswrapper[4727]: I1001 13:01:41.014303 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k7cvr"] Oct 01 13:01:41 crc kubenswrapper[4727]: I1001 13:01:41.332581 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7cvr" event={"ID":"44d99ed4-f0f6-4597-a224-941f817df121","Type":"ContainerStarted","Data":"6777c4e64b4754b7a70a9bbad5511c1d3b4322bb31854f71ea9033d7c3835c3d"} Oct 01 13:01:42 crc kubenswrapper[4727]: I1001 13:01:42.343784 4727 generic.go:334] "Generic (PLEG): container finished" podID="44d99ed4-f0f6-4597-a224-941f817df121" containerID="6f51c83a0f04359e345f1823712e8c803dd892450eafdcff283bc29e9499815f" exitCode=0 Oct 01 13:01:42 crc kubenswrapper[4727]: I1001 13:01:42.343888 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7cvr" event={"ID":"44d99ed4-f0f6-4597-a224-941f817df121","Type":"ContainerDied","Data":"6f51c83a0f04359e345f1823712e8c803dd892450eafdcff283bc29e9499815f"} Oct 01 13:01:42 crc kubenswrapper[4727]: I1001 13:01:42.347859 4727 generic.go:334] "Generic (PLEG): container finished" podID="2f2522c5-4bf3-4d82-af9a-546abdb6c4be" containerID="30ef675a8d7849d85d0ff2aa6e06a8dd8b7a65ac448ed727f82747a27d041ff4" exitCode=0 Oct 01 13:01:42 crc kubenswrapper[4727]: I1001 13:01:42.347903 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" event={"ID":"2f2522c5-4bf3-4d82-af9a-546abdb6c4be","Type":"ContainerDied","Data":"30ef675a8d7849d85d0ff2aa6e06a8dd8b7a65ac448ed727f82747a27d041ff4"} Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.756544 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.794797 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-ssh-key\") pod \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.794863 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-bootstrap-combined-ca-bundle\") pod \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.794890 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-inventory\") pod \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.794959 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxhhr\" (UniqueName: \"kubernetes.io/projected/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-kube-api-access-kxhhr\") pod \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\" (UID: \"2f2522c5-4bf3-4d82-af9a-546abdb6c4be\") " Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.802821 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-kube-api-access-kxhhr" (OuterVolumeSpecName: "kube-api-access-kxhhr") pod "2f2522c5-4bf3-4d82-af9a-546abdb6c4be" (UID: "2f2522c5-4bf3-4d82-af9a-546abdb6c4be"). InnerVolumeSpecName "kube-api-access-kxhhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.819026 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2f2522c5-4bf3-4d82-af9a-546abdb6c4be" (UID: "2f2522c5-4bf3-4d82-af9a-546abdb6c4be"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.827553 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2f2522c5-4bf3-4d82-af9a-546abdb6c4be" (UID: "2f2522c5-4bf3-4d82-af9a-546abdb6c4be"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.828527 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-inventory" (OuterVolumeSpecName: "inventory") pod "2f2522c5-4bf3-4d82-af9a-546abdb6c4be" (UID: "2f2522c5-4bf3-4d82-af9a-546abdb6c4be"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.896686 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.896728 4727 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.896743 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:43 crc kubenswrapper[4727]: I1001 13:01:43.896755 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxhhr\" (UniqueName: \"kubernetes.io/projected/2f2522c5-4bf3-4d82-af9a-546abdb6c4be-kube-api-access-kxhhr\") on node \"crc\" DevicePath \"\"" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.367691 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" event={"ID":"2f2522c5-4bf3-4d82-af9a-546abdb6c4be","Type":"ContainerDied","Data":"f57de7ca028f9961eff7ee4703e5b61e91ad814ac42a666e092b8d6f9f282e9d"} Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.367738 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f57de7ca028f9961eff7ee4703e5b61e91ad814ac42a666e092b8d6f9f282e9d" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.367792 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.476339 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj"] Oct 01 13:01:44 crc kubenswrapper[4727]: E1001 13:01:44.476864 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f2522c5-4bf3-4d82-af9a-546abdb6c4be" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.476889 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f2522c5-4bf3-4d82-af9a-546abdb6c4be" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.477178 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f2522c5-4bf3-4d82-af9a-546abdb6c4be" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.477944 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.482310 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.482385 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.482488 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.482835 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.489161 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj"] Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.511417 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f819364-69c9-47d2-9876-82a3081ab579-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj\" (UID: \"0f819364-69c9-47d2-9876-82a3081ab579\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.511513 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szj9g\" (UniqueName: \"kubernetes.io/projected/0f819364-69c9-47d2-9876-82a3081ab579-kube-api-access-szj9g\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj\" (UID: \"0f819364-69c9-47d2-9876-82a3081ab579\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.511636 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f819364-69c9-47d2-9876-82a3081ab579-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj\" (UID: \"0f819364-69c9-47d2-9876-82a3081ab579\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.612944 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f819364-69c9-47d2-9876-82a3081ab579-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj\" (UID: \"0f819364-69c9-47d2-9876-82a3081ab579\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.613099 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f819364-69c9-47d2-9876-82a3081ab579-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj\" (UID: \"0f819364-69c9-47d2-9876-82a3081ab579\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.613201 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szj9g\" (UniqueName: \"kubernetes.io/projected/0f819364-69c9-47d2-9876-82a3081ab579-kube-api-access-szj9g\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj\" (UID: \"0f819364-69c9-47d2-9876-82a3081ab579\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.617325 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f819364-69c9-47d2-9876-82a3081ab579-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj\" (UID: \"0f819364-69c9-47d2-9876-82a3081ab579\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.629170 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f819364-69c9-47d2-9876-82a3081ab579-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj\" (UID: \"0f819364-69c9-47d2-9876-82a3081ab579\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.630850 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szj9g\" (UniqueName: \"kubernetes.io/projected/0f819364-69c9-47d2-9876-82a3081ab579-kube-api-access-szj9g\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj\" (UID: \"0f819364-69c9-47d2-9876-82a3081ab579\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:01:44 crc kubenswrapper[4727]: I1001 13:01:44.804352 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.074590 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tx89f"] Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.078814 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.087187 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tx89f"] Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.140630 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8257cd0e-01d2-4769-8f26-d27de521ece3-utilities\") pod \"community-operators-tx89f\" (UID: \"8257cd0e-01d2-4769-8f26-d27de521ece3\") " pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.141078 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8257cd0e-01d2-4769-8f26-d27de521ece3-catalog-content\") pod \"community-operators-tx89f\" (UID: \"8257cd0e-01d2-4769-8f26-d27de521ece3\") " pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.141160 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppzsm\" (UniqueName: \"kubernetes.io/projected/8257cd0e-01d2-4769-8f26-d27de521ece3-kube-api-access-ppzsm\") pod \"community-operators-tx89f\" (UID: \"8257cd0e-01d2-4769-8f26-d27de521ece3\") " pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.242992 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8257cd0e-01d2-4769-8f26-d27de521ece3-catalog-content\") pod \"community-operators-tx89f\" (UID: \"8257cd0e-01d2-4769-8f26-d27de521ece3\") " pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.243109 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppzsm\" (UniqueName: \"kubernetes.io/projected/8257cd0e-01d2-4769-8f26-d27de521ece3-kube-api-access-ppzsm\") pod \"community-operators-tx89f\" (UID: \"8257cd0e-01d2-4769-8f26-d27de521ece3\") " pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.243157 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8257cd0e-01d2-4769-8f26-d27de521ece3-utilities\") pod \"community-operators-tx89f\" (UID: \"8257cd0e-01d2-4769-8f26-d27de521ece3\") " pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.243510 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8257cd0e-01d2-4769-8f26-d27de521ece3-catalog-content\") pod \"community-operators-tx89f\" (UID: \"8257cd0e-01d2-4769-8f26-d27de521ece3\") " pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.243619 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8257cd0e-01d2-4769-8f26-d27de521ece3-utilities\") pod \"community-operators-tx89f\" (UID: \"8257cd0e-01d2-4769-8f26-d27de521ece3\") " pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.272379 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ppzsm\" (UniqueName: \"kubernetes.io/projected/8257cd0e-01d2-4769-8f26-d27de521ece3-kube-api-access-ppzsm\") pod \"community-operators-tx89f\" (UID: \"8257cd0e-01d2-4769-8f26-d27de521ece3\") " pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:01:56 crc kubenswrapper[4727]: I1001 13:01:56.398618 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:02:01 crc kubenswrapper[4727]: I1001 13:02:01.511712 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj"] Oct 01 13:02:03 crc kubenswrapper[4727]: I1001 13:02:03.292413 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:02:03 crc kubenswrapper[4727]: I1001 13:02:03.292944 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:02:08 crc kubenswrapper[4727]: I1001 13:02:08.570204 4727 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:02:08 crc kubenswrapper[4727]: I1001 13:02:08.615533 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" event={"ID":"0f819364-69c9-47d2-9876-82a3081ab579","Type":"ContainerStarted","Data":"acf64d658dcfd13de416191266dfa1b49a613bb9119390bb34632ffc793bdda3"} Oct 01 13:02:08 crc kubenswrapper[4727]: E1001 13:02:08.779149 4727 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 13:02:08 crc kubenswrapper[4727]: E1001 13:02:08.779548 4727 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbwzk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-k7cvr_openshift-marketplace(44d99ed4-f0f6-4597-a224-941f817df121): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 13:02:08 crc kubenswrapper[4727]: E1001 13:02:08.784198 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-k7cvr" podUID="44d99ed4-f0f6-4597-a224-941f817df121" Oct 01 13:02:09 crc kubenswrapper[4727]: I1001 13:02:09.009310 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tx89f"] Oct 01 13:02:09 crc kubenswrapper[4727]: W1001 13:02:09.013757 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8257cd0e_01d2_4769_8f26_d27de521ece3.slice/crio-572ea83ee2726f6e152f90d5836de4ceea11345fee3d86886b77e03e1e8865c7 WatchSource:0}: Error finding container 572ea83ee2726f6e152f90d5836de4ceea11345fee3d86886b77e03e1e8865c7: Status 404 returned error can't find the container with id 572ea83ee2726f6e152f90d5836de4ceea11345fee3d86886b77e03e1e8865c7 Oct 01 13:02:09 crc kubenswrapper[4727]: I1001 13:02:09.509495 4727 scope.go:117] "RemoveContainer" containerID="cfc566c49464a3065c000e25c972aab21cfb832ca7ca1ca7e047a4d1d0511c3f" Oct 01 13:02:09 crc kubenswrapper[4727]: I1001 13:02:09.546291 4727 scope.go:117] "RemoveContainer" containerID="dedb8d305ee13ccb866e1f6882d20d74107dd9c5fdcf8048d754c24185f844ed" Oct 01 13:02:09 crc kubenswrapper[4727]: I1001 13:02:09.627487 4727 generic.go:334] "Generic (PLEG): container finished" podID="8257cd0e-01d2-4769-8f26-d27de521ece3" containerID="3ba57423bb4f0d6b2da2c9479f4e92eb6d94fb78c2bbe163e1c0fa0e9b2b7e5a" exitCode=0 Oct 01 13:02:09 crc kubenswrapper[4727]: I1001 13:02:09.627574 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tx89f" 
event={"ID":"8257cd0e-01d2-4769-8f26-d27de521ece3","Type":"ContainerDied","Data":"3ba57423bb4f0d6b2da2c9479f4e92eb6d94fb78c2bbe163e1c0fa0e9b2b7e5a"} Oct 01 13:02:09 crc kubenswrapper[4727]: I1001 13:02:09.627605 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tx89f" event={"ID":"8257cd0e-01d2-4769-8f26-d27de521ece3","Type":"ContainerStarted","Data":"572ea83ee2726f6e152f90d5836de4ceea11345fee3d86886b77e03e1e8865c7"} Oct 01 13:02:09 crc kubenswrapper[4727]: E1001 13:02:09.661879 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-k7cvr" podUID="44d99ed4-f0f6-4597-a224-941f817df121" Oct 01 13:02:09 crc kubenswrapper[4727]: I1001 13:02:09.871320 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:02:10 crc kubenswrapper[4727]: I1001 13:02:10.654425 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" event={"ID":"0f819364-69c9-47d2-9876-82a3081ab579","Type":"ContainerStarted","Data":"5cc2f5b3bda6f94e9c2977637d751f200cb214c5cafc06707652e85d57f99366"} Oct 01 13:02:10 crc kubenswrapper[4727]: I1001 13:02:10.678972 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" podStartSLOduration=25.380940272 podStartE2EDuration="26.678948841s" podCreationTimestamp="2025-10-01 13:01:44 +0000 UTC" firstStartedPulling="2025-10-01 13:02:08.56989279 +0000 UTC m=+1506.891247627" lastFinishedPulling="2025-10-01 13:02:09.867901359 +0000 UTC m=+1508.189256196" observedRunningTime="2025-10-01 13:02:10.67351787 +0000 UTC m=+1508.994872727" watchObservedRunningTime="2025-10-01 13:02:10.678948841 +0000 UTC m=+1509.000303688" Oct 01 13:02:11 crc kubenswrapper[4727]: I1001 13:02:11.666199 4727 generic.go:334] "Generic (PLEG): container finished" podID="8257cd0e-01d2-4769-8f26-d27de521ece3" containerID="685f47256581998be9ed07fc1df6c3407a43fec352be147fc67f87824c3c0dbf" exitCode=0 Oct 01 13:02:11 crc kubenswrapper[4727]: I1001 13:02:11.666280 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tx89f" event={"ID":"8257cd0e-01d2-4769-8f26-d27de521ece3","Type":"ContainerDied","Data":"685f47256581998be9ed07fc1df6c3407a43fec352be147fc67f87824c3c0dbf"} Oct 01 13:02:12 crc kubenswrapper[4727]: I1001 13:02:12.677274 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tx89f" event={"ID":"8257cd0e-01d2-4769-8f26-d27de521ece3","Type":"ContainerStarted","Data":"0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346"} Oct 01 13:02:12 crc kubenswrapper[4727]: I1001 13:02:12.697060 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tx89f" podStartSLOduration=14.058701326 podStartE2EDuration="16.69704184s" podCreationTimestamp="2025-10-01 13:01:56 +0000 UTC" firstStartedPulling="2025-10-01 13:02:09.661730329 +0000 UTC m=+1507.983085166" lastFinishedPulling="2025-10-01 13:02:12.300070843 +0000 UTC m=+1510.621425680" observedRunningTime="2025-10-01 13:02:12.69542145 +0000 UTC m=+1511.016776307" watchObservedRunningTime="2025-10-01 13:02:12.69704184 +0000 UTC 
m=+1511.018396677" Oct 01 13:02:16 crc kubenswrapper[4727]: I1001 13:02:16.399162 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:02:16 crc kubenswrapper[4727]: I1001 13:02:16.400041 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:02:16 crc kubenswrapper[4727]: I1001 13:02:16.445050 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:02:24 crc kubenswrapper[4727]: I1001 13:02:24.037440 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-drxkp"] Oct 01 13:02:24 crc kubenswrapper[4727]: I1001 13:02:24.047279 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-5zh6t"] Oct 01 13:02:24 crc kubenswrapper[4727]: I1001 13:02:24.059405 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-5zh6t"] Oct 01 13:02:24 crc kubenswrapper[4727]: I1001 13:02:24.067513 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-drxkp"] Oct 01 13:02:24 crc kubenswrapper[4727]: I1001 13:02:24.393455 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c449afe5-f791-4d64-9d5d-f3222f7a9f40" path="/var/lib/kubelet/pods/c449afe5-f791-4d64-9d5d-f3222f7a9f40/volumes" Oct 01 13:02:24 crc kubenswrapper[4727]: I1001 13:02:24.394892 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee065f1-f9a2-43bd-ae70-94d196555b5f" path="/var/lib/kubelet/pods/fee065f1-f9a2-43bd-ae70-94d196555b5f/volumes" Oct 01 13:02:25 crc kubenswrapper[4727]: I1001 13:02:25.031255 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-v4zp9"] Oct 01 13:02:25 crc kubenswrapper[4727]: I1001 13:02:25.039603 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-v4zp9"] Oct 01 13:02:25 crc kubenswrapper[4727]: I1001 13:02:25.807990 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7cvr" event={"ID":"44d99ed4-f0f6-4597-a224-941f817df121","Type":"ContainerStarted","Data":"b0a644651c31005e85fb63fdc4bf7811c295fac6bec93aa47a2e90dc09314707"} Oct 01 13:02:26 crc kubenswrapper[4727]: I1001 13:02:26.384585 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d250eeab-d323-428b-95de-ff6d859ee48b" path="/var/lib/kubelet/pods/d250eeab-d323-428b-95de-ff6d859ee48b/volumes" Oct 01 13:02:26 crc kubenswrapper[4727]: I1001 13:02:26.449949 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:02:26 crc kubenswrapper[4727]: I1001 13:02:26.819689 4727 generic.go:334] "Generic (PLEG): container finished" podID="44d99ed4-f0f6-4597-a224-941f817df121" containerID="b0a644651c31005e85fb63fdc4bf7811c295fac6bec93aa47a2e90dc09314707" exitCode=0 Oct 01 13:02:26 crc kubenswrapper[4727]: I1001 13:02:26.819758 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7cvr" event={"ID":"44d99ed4-f0f6-4597-a224-941f817df121","Type":"ContainerDied","Data":"b0a644651c31005e85fb63fdc4bf7811c295fac6bec93aa47a2e90dc09314707"} Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.045769 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-tx89f"] Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.046324 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tx89f" podUID="8257cd0e-01d2-4769-8f26-d27de521ece3" containerName="registry-server" containerID="cri-o://0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346" gracePeriod=2 Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.489826 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.663061 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8257cd0e-01d2-4769-8f26-d27de521ece3-catalog-content\") pod \"8257cd0e-01d2-4769-8f26-d27de521ece3\" (UID: \"8257cd0e-01d2-4769-8f26-d27de521ece3\") " Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.663102 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppzsm\" (UniqueName: \"kubernetes.io/projected/8257cd0e-01d2-4769-8f26-d27de521ece3-kube-api-access-ppzsm\") pod \"8257cd0e-01d2-4769-8f26-d27de521ece3\" (UID: \"8257cd0e-01d2-4769-8f26-d27de521ece3\") " Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.663207 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8257cd0e-01d2-4769-8f26-d27de521ece3-utilities\") pod \"8257cd0e-01d2-4769-8f26-d27de521ece3\" (UID: \"8257cd0e-01d2-4769-8f26-d27de521ece3\") " Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.664553 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8257cd0e-01d2-4769-8f26-d27de521ece3-utilities" (OuterVolumeSpecName: "utilities") pod "8257cd0e-01d2-4769-8f26-d27de521ece3" (UID: "8257cd0e-01d2-4769-8f26-d27de521ece3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.667966 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8257cd0e-01d2-4769-8f26-d27de521ece3-kube-api-access-ppzsm" (OuterVolumeSpecName: "kube-api-access-ppzsm") pod "8257cd0e-01d2-4769-8f26-d27de521ece3" (UID: "8257cd0e-01d2-4769-8f26-d27de521ece3"). InnerVolumeSpecName "kube-api-access-ppzsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.738244 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8257cd0e-01d2-4769-8f26-d27de521ece3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8257cd0e-01d2-4769-8f26-d27de521ece3" (UID: "8257cd0e-01d2-4769-8f26-d27de521ece3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.765689 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8257cd0e-01d2-4769-8f26-d27de521ece3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.765734 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppzsm\" (UniqueName: \"kubernetes.io/projected/8257cd0e-01d2-4769-8f26-d27de521ece3-kube-api-access-ppzsm\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.765746 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8257cd0e-01d2-4769-8f26-d27de521ece3-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.830954 4727 generic.go:334] "Generic (PLEG): container finished" podID="8257cd0e-01d2-4769-8f26-d27de521ece3" containerID="0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346" exitCode=0 Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.831018 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tx89f" event={"ID":"8257cd0e-01d2-4769-8f26-d27de521ece3","Type":"ContainerDied","Data":"0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346"} Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.831058 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tx89f" event={"ID":"8257cd0e-01d2-4769-8f26-d27de521ece3","Type":"ContainerDied","Data":"572ea83ee2726f6e152f90d5836de4ceea11345fee3d86886b77e03e1e8865c7"} Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.831080 4727 scope.go:117] "RemoveContainer" containerID="0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.831159 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tx89f" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.853358 4727 scope.go:117] "RemoveContainer" containerID="685f47256581998be9ed07fc1df6c3407a43fec352be147fc67f87824c3c0dbf" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.873268 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tx89f"] Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.882808 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tx89f"] Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.885421 4727 scope.go:117] "RemoveContainer" containerID="3ba57423bb4f0d6b2da2c9479f4e92eb6d94fb78c2bbe163e1c0fa0e9b2b7e5a" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.908758 4727 scope.go:117] "RemoveContainer" containerID="0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346" Oct 01 13:02:27 crc kubenswrapper[4727]: E1001 13:02:27.909367 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346\": container with ID starting with 0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346 not found: ID does not exist" containerID="0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.909404 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346"} err="failed to get container status \"0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346\": rpc error: code = NotFound desc = could not find container \"0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346\": container with ID starting with 0d7cab30451b60576b44a1ba34365114e55c49d95d99c38d84343837f334b346 not found: ID does not exist" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.909430 4727 scope.go:117] "RemoveContainer" containerID="685f47256581998be9ed07fc1df6c3407a43fec352be147fc67f87824c3c0dbf" Oct 01 13:02:27 crc kubenswrapper[4727]: E1001 13:02:27.909690 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"685f47256581998be9ed07fc1df6c3407a43fec352be147fc67f87824c3c0dbf\": container with ID starting with 685f47256581998be9ed07fc1df6c3407a43fec352be147fc67f87824c3c0dbf not found: ID does not exist" containerID="685f47256581998be9ed07fc1df6c3407a43fec352be147fc67f87824c3c0dbf" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.909786 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"685f47256581998be9ed07fc1df6c3407a43fec352be147fc67f87824c3c0dbf"} err="failed to get container status \"685f47256581998be9ed07fc1df6c3407a43fec352be147fc67f87824c3c0dbf\": rpc error: code = NotFound desc = could not find container \"685f47256581998be9ed07fc1df6c3407a43fec352be147fc67f87824c3c0dbf\": container with ID starting with 685f47256581998be9ed07fc1df6c3407a43fec352be147fc67f87824c3c0dbf not found: ID does not exist" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.909899 4727 scope.go:117] "RemoveContainer" containerID="3ba57423bb4f0d6b2da2c9479f4e92eb6d94fb78c2bbe163e1c0fa0e9b2b7e5a" Oct 01 13:02:27 crc kubenswrapper[4727]: E1001 13:02:27.910331 4727 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3ba57423bb4f0d6b2da2c9479f4e92eb6d94fb78c2bbe163e1c0fa0e9b2b7e5a\": container with ID starting with 3ba57423bb4f0d6b2da2c9479f4e92eb6d94fb78c2bbe163e1c0fa0e9b2b7e5a not found: ID does not exist" containerID="3ba57423bb4f0d6b2da2c9479f4e92eb6d94fb78c2bbe163e1c0fa0e9b2b7e5a" Oct 01 13:02:27 crc kubenswrapper[4727]: I1001 13:02:27.910356 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba57423bb4f0d6b2da2c9479f4e92eb6d94fb78c2bbe163e1c0fa0e9b2b7e5a"} err="failed to get container status \"3ba57423bb4f0d6b2da2c9479f4e92eb6d94fb78c2bbe163e1c0fa0e9b2b7e5a\": rpc error: code = NotFound desc = could not find container \"3ba57423bb4f0d6b2da2c9479f4e92eb6d94fb78c2bbe163e1c0fa0e9b2b7e5a\": container with ID starting with 3ba57423bb4f0d6b2da2c9479f4e92eb6d94fb78c2bbe163e1c0fa0e9b2b7e5a not found: ID does not exist" Oct 01 13:02:28 crc kubenswrapper[4727]: I1001 13:02:28.385194 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8257cd0e-01d2-4769-8f26-d27de521ece3" path="/var/lib/kubelet/pods/8257cd0e-01d2-4769-8f26-d27de521ece3/volumes" Oct 01 13:02:28 crc kubenswrapper[4727]: I1001 13:02:28.846605 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7cvr" event={"ID":"44d99ed4-f0f6-4597-a224-941f817df121","Type":"ContainerStarted","Data":"26bae13afb80ed82a9f528319a9f50b35d6b57bed45c8989a7fd2a4f5ff4d3aa"} Oct 01 13:02:30 crc kubenswrapper[4727]: I1001 13:02:30.558549 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:02:30 crc kubenswrapper[4727]: I1001 13:02:30.559062 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:02:31 crc kubenswrapper[4727]: I1001 13:02:31.616898 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k7cvr" podUID="44d99ed4-f0f6-4597-a224-941f817df121" containerName="registry-server" probeResult="failure" output=< Oct 01 13:02:31 crc kubenswrapper[4727]: timeout: failed to connect service ":50051" within 1s Oct 01 13:02:31 crc kubenswrapper[4727]: > Oct 01 13:02:33 crc kubenswrapper[4727]: I1001 13:02:33.292055 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:02:33 crc kubenswrapper[4727]: I1001 13:02:33.292396 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:02:34 crc kubenswrapper[4727]: I1001 13:02:34.027795 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k7cvr" podStartSLOduration=8.811162645 podStartE2EDuration="54.027775757s" podCreationTimestamp="2025-10-01 13:01:40 +0000 UTC" firstStartedPulling="2025-10-01 13:01:42.345828055 +0000 UTC m=+1480.667182882" lastFinishedPulling="2025-10-01 13:02:27.562441157 +0000 UTC m=+1525.883795994" observedRunningTime="2025-10-01 
13:02:28.865853966 +0000 UTC m=+1527.187208833" watchObservedRunningTime="2025-10-01 13:02:34.027775757 +0000 UTC m=+1532.349130594" Oct 01 13:02:34 crc kubenswrapper[4727]: I1001 13:02:34.039341 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c969-account-create-n4qx8"] Oct 01 13:02:34 crc kubenswrapper[4727]: I1001 13:02:34.051498 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c969-account-create-n4qx8"] Oct 01 13:02:34 crc kubenswrapper[4727]: I1001 13:02:34.384584 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="409c0051-d099-44b5-97bb-93d7a47a91e6" path="/var/lib/kubelet/pods/409c0051-d099-44b5-97bb-93d7a47a91e6/volumes" Oct 01 13:02:35 crc kubenswrapper[4727]: I1001 13:02:35.028687 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-82e6-account-create-tbgvp"] Oct 01 13:02:35 crc kubenswrapper[4727]: I1001 13:02:35.037941 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-82e6-account-create-tbgvp"] Oct 01 13:02:36 crc kubenswrapper[4727]: I1001 13:02:36.035009 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-691e-account-create-xrs5s"] Oct 01 13:02:36 crc kubenswrapper[4727]: I1001 13:02:36.043373 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-691e-account-create-xrs5s"] Oct 01 13:02:36 crc kubenswrapper[4727]: I1001 13:02:36.385868 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c5853f7-9f25-4027-a1bd-0917f55f0fb5" path="/var/lib/kubelet/pods/1c5853f7-9f25-4027-a1bd-0917f55f0fb5/volumes" Oct 01 13:02:36 crc kubenswrapper[4727]: I1001 13:02:36.387188 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f48d653-b71d-4f18-be4f-c8883b023f59" path="/var/lib/kubelet/pods/4f48d653-b71d-4f18-be4f-c8883b023f59/volumes" Oct 01 13:02:40 crc kubenswrapper[4727]: I1001 13:02:40.606337 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:02:40 crc kubenswrapper[4727]: I1001 13:02:40.663392 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k7cvr" Oct 01 13:02:40 crc kubenswrapper[4727]: I1001 13:02:40.738386 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k7cvr"] Oct 01 13:02:40 crc kubenswrapper[4727]: I1001 13:02:40.842461 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bcnlj"] Oct 01 13:02:40 crc kubenswrapper[4727]: I1001 13:02:40.842778 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bcnlj" podUID="dac96fe7-54ef-42de-93da-4c8ea9b2f1df" containerName="registry-server" containerID="cri-o://00608454bf12ef580dc74d1a697fc1ce159f9ffc2c1f6b80c385d3fdc8b07c07" gracePeriod=2 Oct 01 13:02:40 crc kubenswrapper[4727]: I1001 13:02:40.987224 4727 generic.go:334] "Generic (PLEG): container finished" podID="dac96fe7-54ef-42de-93da-4c8ea9b2f1df" containerID="00608454bf12ef580dc74d1a697fc1ce159f9ffc2c1f6b80c385d3fdc8b07c07" exitCode=0 Oct 01 13:02:40 crc kubenswrapper[4727]: I1001 13:02:40.987309 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcnlj" 
event={"ID":"dac96fe7-54ef-42de-93da-4c8ea9b2f1df","Type":"ContainerDied","Data":"00608454bf12ef580dc74d1a697fc1ce159f9ffc2c1f6b80c385d3fdc8b07c07"} Oct 01 13:02:41 crc kubenswrapper[4727]: I1001 13:02:41.369904 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 13:02:41 crc kubenswrapper[4727]: I1001 13:02:41.417381 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxw5c\" (UniqueName: \"kubernetes.io/projected/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-kube-api-access-zxw5c\") pod \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\" (UID: \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\") " Oct 01 13:02:41 crc kubenswrapper[4727]: I1001 13:02:41.417859 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-catalog-content\") pod \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\" (UID: \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\") " Oct 01 13:02:41 crc kubenswrapper[4727]: I1001 13:02:41.418020 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-utilities\") pod \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\" (UID: \"dac96fe7-54ef-42de-93da-4c8ea9b2f1df\") " Oct 01 13:02:41 crc kubenswrapper[4727]: I1001 13:02:41.419033 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-utilities" (OuterVolumeSpecName: "utilities") pod "dac96fe7-54ef-42de-93da-4c8ea9b2f1df" (UID: "dac96fe7-54ef-42de-93da-4c8ea9b2f1df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:02:41 crc kubenswrapper[4727]: I1001 13:02:41.435413 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-kube-api-access-zxw5c" (OuterVolumeSpecName: "kube-api-access-zxw5c") pod "dac96fe7-54ef-42de-93da-4c8ea9b2f1df" (UID: "dac96fe7-54ef-42de-93da-4c8ea9b2f1df"). InnerVolumeSpecName "kube-api-access-zxw5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:02:41 crc kubenswrapper[4727]: I1001 13:02:41.520745 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:41 crc kubenswrapper[4727]: I1001 13:02:41.520778 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxw5c\" (UniqueName: \"kubernetes.io/projected/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-kube-api-access-zxw5c\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:41 crc kubenswrapper[4727]: I1001 13:02:41.522319 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dac96fe7-54ef-42de-93da-4c8ea9b2f1df" (UID: "dac96fe7-54ef-42de-93da-4c8ea9b2f1df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:02:41 crc kubenswrapper[4727]: I1001 13:02:41.622748 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dac96fe7-54ef-42de-93da-4c8ea9b2f1df-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:02:42 crc kubenswrapper[4727]: I1001 13:02:42.002629 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bcnlj" Oct 01 13:02:42 crc kubenswrapper[4727]: I1001 13:02:42.003152 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bcnlj" event={"ID":"dac96fe7-54ef-42de-93da-4c8ea9b2f1df","Type":"ContainerDied","Data":"e29f8b4be5526269e28144991cbf14d6fbb1884eaf9230f9b70558b0b1dff38e"} Oct 01 13:02:42 crc kubenswrapper[4727]: I1001 13:02:42.003261 4727 scope.go:117] "RemoveContainer" containerID="00608454bf12ef580dc74d1a697fc1ce159f9ffc2c1f6b80c385d3fdc8b07c07" Oct 01 13:02:42 crc kubenswrapper[4727]: I1001 13:02:42.035221 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-cth6t"] Oct 01 13:02:42 crc kubenswrapper[4727]: I1001 13:02:42.038277 4727 scope.go:117] "RemoveContainer" containerID="e64326dd3c07e03c442747a7475ea92b8cc8cb05ea849b4ccddac47839c421dc" Oct 01 13:02:42 crc kubenswrapper[4727]: I1001 13:02:42.051310 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-cth6t"] Oct 01 13:02:42 crc kubenswrapper[4727]: I1001 13:02:42.061593 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bcnlj"] Oct 01 13:02:42 crc kubenswrapper[4727]: I1001 13:02:42.069583 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bcnlj"] Oct 01 13:02:42 crc kubenswrapper[4727]: I1001 13:02:42.084181 4727 scope.go:117] "RemoveContainer" containerID="3e1915404c2d754c82cdfedae65f9c48cadb886978e078dc27f770c02a0922d5" Oct 01 13:02:42 crc kubenswrapper[4727]: I1001 13:02:42.382803 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b59d75-3cf2-451d-b303-07dac30964e5" path="/var/lib/kubelet/pods/04b59d75-3cf2-451d-b303-07dac30964e5/volumes" Oct 01 13:02:42 crc kubenswrapper[4727]: I1001 13:02:42.383893 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dac96fe7-54ef-42de-93da-4c8ea9b2f1df" path="/var/lib/kubelet/pods/dac96fe7-54ef-42de-93da-4c8ea9b2f1df/volumes" Oct 01 13:02:43 crc kubenswrapper[4727]: I1001 13:02:43.037983 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-75b77"] Oct 01 13:02:43 crc kubenswrapper[4727]: I1001 13:02:43.047826 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-75b77"] Oct 01 13:02:43 crc kubenswrapper[4727]: I1001 13:02:43.057072 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-btgc6"] Oct 01 13:02:43 crc kubenswrapper[4727]: I1001 13:02:43.067065 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-btgc6"] Oct 01 13:02:44 crc kubenswrapper[4727]: I1001 13:02:44.391508 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e1c176-87f7-401f-9137-00d70f843212" path="/var/lib/kubelet/pods/d5e1c176-87f7-401f-9137-00d70f843212/volumes" Oct 01 13:02:44 crc kubenswrapper[4727]: I1001 13:02:44.392194 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9" path="/var/lib/kubelet/pods/fdf7d4ae-6ebf-4bdb-9a09-8aae270477a9/volumes" Oct 01 13:02:56 crc kubenswrapper[4727]: I1001 13:02:56.045106 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-h9swf"] Oct 01 13:02:56 crc kubenswrapper[4727]: I1001 13:02:56.055914 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ce84-account-create-7bvhl"] Oct 01 13:02:56 crc kubenswrapper[4727]: I1001 13:02:56.068630 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ce84-account-create-7bvhl"] Oct 01 13:02:56 crc kubenswrapper[4727]: I1001 13:02:56.080853 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-h9swf"] Oct 01 13:02:56 crc kubenswrapper[4727]: I1001 13:02:56.387138 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615b1b59-cd92-4d09-bce0-5c3ee394a7b3" path="/var/lib/kubelet/pods/615b1b59-cd92-4d09-bce0-5c3ee394a7b3/volumes" Oct 01 13:02:56 crc kubenswrapper[4727]: I1001 13:02:56.388251 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a571990-0e3c-4dc3-805f-5620123cca26" path="/var/lib/kubelet/pods/9a571990-0e3c-4dc3-805f-5620123cca26/volumes" Oct 01 13:02:59 crc kubenswrapper[4727]: I1001 13:02:59.026865 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-22e1-account-create-rxdnh"] Oct 01 13:02:59 crc kubenswrapper[4727]: I1001 13:02:59.034753 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-22e1-account-create-rxdnh"] Oct 01 13:03:00 crc kubenswrapper[4727]: I1001 13:03:00.383770 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7255c8b-01a2-4ab2-82cf-1480602a1083" path="/var/lib/kubelet/pods/d7255c8b-01a2-4ab2-82cf-1480602a1083/volumes" Oct 01 13:03:03 crc kubenswrapper[4727]: I1001 13:03:03.292270 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:03:03 crc kubenswrapper[4727]: I1001 13:03:03.292671 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:03:03 crc kubenswrapper[4727]: I1001 13:03:03.292725 4727 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 13:03:03 crc kubenswrapper[4727]: I1001 13:03:03.293632 4727 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6"} pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:03:03 crc kubenswrapper[4727]: I1001 13:03:03.293684 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" 
containerID="cri-o://34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" gracePeriod=600 Oct 01 13:03:03 crc kubenswrapper[4727]: E1001 13:03:03.946136 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:03:04 crc kubenswrapper[4727]: I1001 13:03:04.027961 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9250-account-create-7h2cw"] Oct 01 13:03:04 crc kubenswrapper[4727]: I1001 13:03:04.037063 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9250-account-create-7h2cw"] Oct 01 13:03:04 crc kubenswrapper[4727]: I1001 13:03:04.217963 4727 generic.go:334] "Generic (PLEG): container finished" podID="d18290ae-64a5-44a5-a704-90977d85852b" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" exitCode=0 Oct 01 13:03:04 crc kubenswrapper[4727]: I1001 13:03:04.218033 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerDied","Data":"34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6"} Oct 01 13:03:04 crc kubenswrapper[4727]: I1001 13:03:04.218072 4727 scope.go:117] "RemoveContainer" containerID="cacd2a9209dd857fc1890a57e560a24e6efca70576638e54f6197ee82d5463f5" Oct 01 13:03:04 crc kubenswrapper[4727]: I1001 13:03:04.218769 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:03:04 crc kubenswrapper[4727]: E1001 13:03:04.219092 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:03:04 crc kubenswrapper[4727]: I1001 13:03:04.393713 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9819e121-08cf-4bc2-ab90-1560b86b3cd5" path="/var/lib/kubelet/pods/9819e121-08cf-4bc2-ab90-1560b86b3cd5/volumes" Oct 01 13:03:08 crc kubenswrapper[4727]: I1001 13:03:08.028112 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-w52kz"] Oct 01 13:03:08 crc kubenswrapper[4727]: I1001 13:03:08.035839 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-w52kz"] Oct 01 13:03:08 crc kubenswrapper[4727]: I1001 13:03:08.392508 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cf7db27-ff87-481e-a776-cb171e57f4b9" path="/var/lib/kubelet/pods/5cf7db27-ff87-481e-a776-cb171e57f4b9/volumes" Oct 01 13:03:09 crc kubenswrapper[4727]: I1001 13:03:09.673187 4727 scope.go:117] "RemoveContainer" containerID="4d095756fe013c4991248c3b64acdbc2cadcd44c4b3611d8467380c33f620472" Oct 01 13:03:09 crc kubenswrapper[4727]: I1001 13:03:09.701493 4727 scope.go:117] "RemoveContainer" containerID="016eac71034346e62c68c783917ba4a1143cc1edc129602bba128f742185407c" Oct 01 
13:03:09 crc kubenswrapper[4727]: I1001 13:03:09.747130 4727 scope.go:117] "RemoveContainer" containerID="bf23e0eabc43f46342e3bb785ad57dc13f4253b398318a5778337cf75e0573a1" Oct 01 13:03:09 crc kubenswrapper[4727]: I1001 13:03:09.808805 4727 scope.go:117] "RemoveContainer" containerID="610be50434a4cd2af19ada4d94fda4e0e0720676a36cd500c14d25fa3f0f9b69" Oct 01 13:03:09 crc kubenswrapper[4727]: I1001 13:03:09.840369 4727 scope.go:117] "RemoveContainer" containerID="c89584cce96dda52b5be727159cc31fe921bf13ebf960fdf9dcc8d50a5d5cefb" Oct 01 13:03:09 crc kubenswrapper[4727]: I1001 13:03:09.883397 4727 scope.go:117] "RemoveContainer" containerID="82c4031302220502d0e46eca03c679cab70717e6c7b534805db20a4238a158cd" Oct 01 13:03:09 crc kubenswrapper[4727]: I1001 13:03:09.921300 4727 scope.go:117] "RemoveContainer" containerID="08d10bd2e8da08a15291a224e9194caa4505321d5f0cbd297ac3f2d3f5561a80" Oct 01 13:03:09 crc kubenswrapper[4727]: I1001 13:03:09.945919 4727 scope.go:117] "RemoveContainer" containerID="9a589aab940c5d3eb6b9317b4e6cfbe014f2dc533aaf9c66e3830925ed99ebc0" Oct 01 13:03:09 crc kubenswrapper[4727]: I1001 13:03:09.969310 4727 scope.go:117] "RemoveContainer" containerID="503e5b2d580b683cfb5a9fefc83eb21a7b99d8005b38d58261ba09724dce2424" Oct 01 13:03:09 crc kubenswrapper[4727]: I1001 13:03:09.990495 4727 scope.go:117] "RemoveContainer" containerID="784f7336b16d43eac26b41ecc990efe969652be25251ea9d9cb7fba26323dcb8" Oct 01 13:03:10 crc kubenswrapper[4727]: I1001 13:03:10.022913 4727 scope.go:117] "RemoveContainer" containerID="5590afea4d629222d948fc2cd4248dd205bef8cfb95e13be86ef218ec42481b9" Oct 01 13:03:10 crc kubenswrapper[4727]: I1001 13:03:10.044129 4727 scope.go:117] "RemoveContainer" containerID="d01dbe5e7d3fa66afa6e2d63083986a5682b9751e62e8d9115fb1e5fe0317a27" Oct 01 13:03:10 crc kubenswrapper[4727]: I1001 13:03:10.083866 4727 scope.go:117] "RemoveContainer" containerID="f11dceb574f0003ce6d97d923ac3d0852a169bb4c1ddf18b53905f14898a4351" Oct 01 13:03:10 crc kubenswrapper[4727]: I1001 13:03:10.113284 4727 scope.go:117] "RemoveContainer" containerID="9a1f32e45bd50fa3cc563f87b03d0aa1f5f0c9f467e0658706d1657118a6ff31" Oct 01 13:03:19 crc kubenswrapper[4727]: I1001 13:03:19.028967 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-czk9d"] Oct 01 13:03:19 crc kubenswrapper[4727]: I1001 13:03:19.037242 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-czk9d"] Oct 01 13:03:19 crc kubenswrapper[4727]: I1001 13:03:19.372661 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:03:19 crc kubenswrapper[4727]: E1001 13:03:19.372979 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:03:20 crc kubenswrapper[4727]: I1001 13:03:20.385157 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f868f9-d4a4-41e8-ac05-f0a1c659e18b" path="/var/lib/kubelet/pods/d3f868f9-d4a4-41e8-ac05-f0a1c659e18b/volumes" Oct 01 13:03:31 crc kubenswrapper[4727]: I1001 13:03:31.373513 4727 scope.go:117] "RemoveContainer" 
containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:03:31 crc kubenswrapper[4727]: E1001 13:03:31.374214 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:03:33 crc kubenswrapper[4727]: I1001 13:03:33.037859 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2qtjv"] Oct 01 13:03:33 crc kubenswrapper[4727]: I1001 13:03:33.045794 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2qtjv"] Oct 01 13:03:34 crc kubenswrapper[4727]: I1001 13:03:34.035471 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-rq4c7"] Oct 01 13:03:34 crc kubenswrapper[4727]: I1001 13:03:34.045403 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-rq4c7"] Oct 01 13:03:34 crc kubenswrapper[4727]: I1001 13:03:34.394519 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="087bee3f-a34f-43ca-ac4b-b3e46e068898" path="/var/lib/kubelet/pods/087bee3f-a34f-43ca-ac4b-b3e46e068898/volumes" Oct 01 13:03:34 crc kubenswrapper[4727]: I1001 13:03:34.395901 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb9c560-a9b2-4243-b59a-b40142e48739" path="/var/lib/kubelet/pods/4eb9c560-a9b2-4243-b59a-b40142e48739/volumes" Oct 01 13:03:42 crc kubenswrapper[4727]: I1001 13:03:42.379291 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:03:42 crc kubenswrapper[4727]: E1001 13:03:42.380325 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:03:45 crc kubenswrapper[4727]: I1001 13:03:45.631501 4727 generic.go:334] "Generic (PLEG): container finished" podID="0f819364-69c9-47d2-9876-82a3081ab579" containerID="5cc2f5b3bda6f94e9c2977637d751f200cb214c5cafc06707652e85d57f99366" exitCode=0 Oct 01 13:03:45 crc kubenswrapper[4727]: I1001 13:03:45.631624 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" event={"ID":"0f819364-69c9-47d2-9876-82a3081ab579","Type":"ContainerDied","Data":"5cc2f5b3bda6f94e9c2977637d751f200cb214c5cafc06707652e85d57f99366"} Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.121180 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.244307 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szj9g\" (UniqueName: \"kubernetes.io/projected/0f819364-69c9-47d2-9876-82a3081ab579-kube-api-access-szj9g\") pod \"0f819364-69c9-47d2-9876-82a3081ab579\" (UID: \"0f819364-69c9-47d2-9876-82a3081ab579\") " Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.244478 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f819364-69c9-47d2-9876-82a3081ab579-inventory\") pod \"0f819364-69c9-47d2-9876-82a3081ab579\" (UID: \"0f819364-69c9-47d2-9876-82a3081ab579\") " Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.244611 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f819364-69c9-47d2-9876-82a3081ab579-ssh-key\") pod \"0f819364-69c9-47d2-9876-82a3081ab579\" (UID: \"0f819364-69c9-47d2-9876-82a3081ab579\") " Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.250281 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f819364-69c9-47d2-9876-82a3081ab579-kube-api-access-szj9g" (OuterVolumeSpecName: "kube-api-access-szj9g") pod "0f819364-69c9-47d2-9876-82a3081ab579" (UID: "0f819364-69c9-47d2-9876-82a3081ab579"). InnerVolumeSpecName "kube-api-access-szj9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.270699 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f819364-69c9-47d2-9876-82a3081ab579-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0f819364-69c9-47d2-9876-82a3081ab579" (UID: "0f819364-69c9-47d2-9876-82a3081ab579"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.272987 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f819364-69c9-47d2-9876-82a3081ab579-inventory" (OuterVolumeSpecName: "inventory") pod "0f819364-69c9-47d2-9876-82a3081ab579" (UID: "0f819364-69c9-47d2-9876-82a3081ab579"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.347164 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szj9g\" (UniqueName: \"kubernetes.io/projected/0f819364-69c9-47d2-9876-82a3081ab579-kube-api-access-szj9g\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.347211 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0f819364-69c9-47d2-9876-82a3081ab579-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.347222 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0f819364-69c9-47d2-9876-82a3081ab579-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.654460 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" event={"ID":"0f819364-69c9-47d2-9876-82a3081ab579","Type":"ContainerDied","Data":"acf64d658dcfd13de416191266dfa1b49a613bb9119390bb34632ffc793bdda3"} Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.654533 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acf64d658dcfd13de416191266dfa1b49a613bb9119390bb34632ffc793bdda3" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.654547 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.744482 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt"] Oct 01 13:03:47 crc kubenswrapper[4727]: E1001 13:03:47.745075 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f819364-69c9-47d2-9876-82a3081ab579" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.745103 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f819364-69c9-47d2-9876-82a3081ab579" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 13:03:47 crc kubenswrapper[4727]: E1001 13:03:47.745135 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8257cd0e-01d2-4769-8f26-d27de521ece3" containerName="registry-server" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.745144 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="8257cd0e-01d2-4769-8f26-d27de521ece3" containerName="registry-server" Oct 01 13:03:47 crc kubenswrapper[4727]: E1001 13:03:47.745163 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac96fe7-54ef-42de-93da-4c8ea9b2f1df" containerName="extract-utilities" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.745172 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac96fe7-54ef-42de-93da-4c8ea9b2f1df" containerName="extract-utilities" Oct 01 13:03:47 crc kubenswrapper[4727]: E1001 13:03:47.745198 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8257cd0e-01d2-4769-8f26-d27de521ece3" containerName="extract-utilities" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.745207 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="8257cd0e-01d2-4769-8f26-d27de521ece3" containerName="extract-utilities" Oct 01 13:03:47 crc kubenswrapper[4727]: E1001 13:03:47.745218 4727 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac96fe7-54ef-42de-93da-4c8ea9b2f1df" containerName="extract-content" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.745227 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac96fe7-54ef-42de-93da-4c8ea9b2f1df" containerName="extract-content" Oct 01 13:03:47 crc kubenswrapper[4727]: E1001 13:03:47.745243 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac96fe7-54ef-42de-93da-4c8ea9b2f1df" containerName="registry-server" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.745251 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac96fe7-54ef-42de-93da-4c8ea9b2f1df" containerName="registry-server" Oct 01 13:03:47 crc kubenswrapper[4727]: E1001 13:03:47.745262 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8257cd0e-01d2-4769-8f26-d27de521ece3" containerName="extract-content" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.745270 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="8257cd0e-01d2-4769-8f26-d27de521ece3" containerName="extract-content" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.745499 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac96fe7-54ef-42de-93da-4c8ea9b2f1df" containerName="registry-server" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.745522 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="8257cd0e-01d2-4769-8f26-d27de521ece3" containerName="registry-server" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.745545 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f819364-69c9-47d2-9876-82a3081ab579" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.746416 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.750183 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.750305 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.750765 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.750826 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.764317 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt"] Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.860393 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5vr5\" (UniqueName: \"kubernetes.io/projected/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-kube-api-access-n5vr5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt\" (UID: \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.860461 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt\" (UID: \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.860535 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt\" (UID: \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.962629 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5vr5\" (UniqueName: \"kubernetes.io/projected/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-kube-api-access-n5vr5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt\" (UID: \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.962726 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt\" (UID: \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.962771 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt\" (UID: \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.969411 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt\" (UID: \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.969496 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt\" (UID: \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:03:47 crc kubenswrapper[4727]: I1001 13:03:47.981376 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5vr5\" (UniqueName: \"kubernetes.io/projected/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-kube-api-access-n5vr5\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt\" (UID: \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:03:48 crc kubenswrapper[4727]: I1001 13:03:48.076028 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:03:48 crc kubenswrapper[4727]: I1001 13:03:48.581876 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt"] Oct 01 13:03:48 crc kubenswrapper[4727]: I1001 13:03:48.663211 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" event={"ID":"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4","Type":"ContainerStarted","Data":"3779bfce4b1741e643006f2163f28ff4f8752be6ebc3bca8c9bf34fcb10f2e80"} Oct 01 13:03:49 crc kubenswrapper[4727]: I1001 13:03:49.676177 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" event={"ID":"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4","Type":"ContainerStarted","Data":"621a4e508115d45b302dc157d33d61e544b029703d98638f3c33f4a3b8b0d30b"} Oct 01 13:03:49 crc kubenswrapper[4727]: I1001 13:03:49.703568 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" podStartSLOduration=1.98045568 podStartE2EDuration="2.703543675s" podCreationTimestamp="2025-10-01 13:03:47 +0000 UTC" firstStartedPulling="2025-10-01 13:03:48.592069491 +0000 UTC m=+1606.913424328" lastFinishedPulling="2025-10-01 13:03:49.315157486 +0000 UTC m=+1607.636512323" observedRunningTime="2025-10-01 13:03:49.699800788 +0000 UTC m=+1608.021155635" watchObservedRunningTime="2025-10-01 13:03:49.703543675 +0000 UTC m=+1608.024898512" Oct 01 13:03:55 crc kubenswrapper[4727]: I1001 13:03:55.372849 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:03:55 crc kubenswrapper[4727]: E1001 13:03:55.374352 4727 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:03:57 crc kubenswrapper[4727]: I1001 13:03:57.045725 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-nkrx8"] Oct 01 13:03:57 crc kubenswrapper[4727]: I1001 13:03:57.053965 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-nkrx8"] Oct 01 13:03:58 crc kubenswrapper[4727]: I1001 13:03:58.387548 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4da191c-6509-4bb7-b9b2-344f8224ae58" path="/var/lib/kubelet/pods/d4da191c-6509-4bb7-b9b2-344f8224ae58/volumes" Oct 01 13:03:59 crc kubenswrapper[4727]: I1001 13:03:59.045950 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-74wc5"] Oct 01 13:03:59 crc kubenswrapper[4727]: I1001 13:03:59.046386 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-74wc5"] Oct 01 13:04:00 crc kubenswrapper[4727]: I1001 13:04:00.386456 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5746629a-ce5e-4404-8996-165034633b9e" path="/var/lib/kubelet/pods/5746629a-ce5e-4404-8996-165034633b9e/volumes" Oct 01 13:04:06 crc kubenswrapper[4727]: I1001 13:04:06.372852 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:04:06 crc kubenswrapper[4727]: E1001 13:04:06.373623 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:04:10 crc kubenswrapper[4727]: I1001 13:04:10.426421 4727 scope.go:117] "RemoveContainer" containerID="e5d027d35ac788cdd3b8fd098e08727c51f90462c0daaa21863e69e95c618ca0" Oct 01 13:04:10 crc kubenswrapper[4727]: I1001 13:04:10.476170 4727 scope.go:117] "RemoveContainer" containerID="031f6669cf8749f3fcf4677257b33ab960904b0aea2446f27aae497cffcf9f2b" Oct 01 13:04:10 crc kubenswrapper[4727]: I1001 13:04:10.528319 4727 scope.go:117] "RemoveContainer" containerID="9df9fca93de18a593d6c01b4c56a4f0c1ab4ae6be3d3bb88e43ece750ad4d56c" Oct 01 13:04:10 crc kubenswrapper[4727]: I1001 13:04:10.603760 4727 scope.go:117] "RemoveContainer" containerID="5972f220b28b5bac03f6c034fe2fbae406896dcbdd6967f9eb5538b6ffdc638f" Oct 01 13:04:10 crc kubenswrapper[4727]: I1001 13:04:10.652572 4727 scope.go:117] "RemoveContainer" containerID="caabaacb4f936f97251ae8d7461306a7c895a2418f40a12965e723fecf38b167" Oct 01 13:04:10 crc kubenswrapper[4727]: I1001 13:04:10.680700 4727 scope.go:117] "RemoveContainer" containerID="60bc5b53364f3a052513adb74535bdb4845204bbd4e01a6a62500eeac82e9b0b" Oct 01 13:04:10 crc kubenswrapper[4727]: I1001 13:04:10.712777 4727 scope.go:117] "RemoveContainer" containerID="563afdd8ffad3ca310f8579bbc461b2ed2209a251299748fa59749bfa7519cec" Oct 01 13:04:21 crc kubenswrapper[4727]: I1001 13:04:21.372570 4727 scope.go:117] "RemoveContainer" 
containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:04:21 crc kubenswrapper[4727]: E1001 13:04:21.373243 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:04:24 crc kubenswrapper[4727]: I1001 13:04:24.050536 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-4jc2p"] Oct 01 13:04:24 crc kubenswrapper[4727]: I1001 13:04:24.060545 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-nn6ql"] Oct 01 13:04:24 crc kubenswrapper[4727]: I1001 13:04:24.069287 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-7zj4h"] Oct 01 13:04:24 crc kubenswrapper[4727]: I1001 13:04:24.079371 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-4jc2p"] Oct 01 13:04:24 crc kubenswrapper[4727]: I1001 13:04:24.087628 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-nn6ql"] Oct 01 13:04:24 crc kubenswrapper[4727]: I1001 13:04:24.096368 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-7zj4h"] Oct 01 13:04:24 crc kubenswrapper[4727]: I1001 13:04:24.384317 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="586aeab7-2b38-400f-827b-6ea16b3bf9c4" path="/var/lib/kubelet/pods/586aeab7-2b38-400f-827b-6ea16b3bf9c4/volumes" Oct 01 13:04:24 crc kubenswrapper[4727]: I1001 13:04:24.384983 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8df368e0-d59d-40bd-8aea-d68ab67ba406" path="/var/lib/kubelet/pods/8df368e0-d59d-40bd-8aea-d68ab67ba406/volumes" Oct 01 13:04:24 crc kubenswrapper[4727]: I1001 13:04:24.385527 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1c4e0f3-9f27-4304-a848-6b3482161126" path="/var/lib/kubelet/pods/f1c4e0f3-9f27-4304-a848-6b3482161126/volumes" Oct 01 13:04:33 crc kubenswrapper[4727]: I1001 13:04:33.372821 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:04:33 crc kubenswrapper[4727]: E1001 13:04:33.373890 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:04:34 crc kubenswrapper[4727]: I1001 13:04:34.032558 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3933-account-create-b5cwg"] Oct 01 13:04:34 crc kubenswrapper[4727]: I1001 13:04:34.039613 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cdda-account-create-s6lzz"] Oct 01 13:04:34 crc kubenswrapper[4727]: I1001 13:04:34.049480 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0ba1-account-create-6qhvv"] Oct 01 13:04:34 crc kubenswrapper[4727]: I1001 13:04:34.059917 4727 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3933-account-create-b5cwg"] Oct 01 13:04:34 crc kubenswrapper[4727]: I1001 13:04:34.068653 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cdda-account-create-s6lzz"] Oct 01 13:04:34 crc kubenswrapper[4727]: I1001 13:04:34.076591 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0ba1-account-create-6qhvv"] Oct 01 13:04:34 crc kubenswrapper[4727]: I1001 13:04:34.384782 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146661f6-7cea-4d6c-904b-8252681753cb" path="/var/lib/kubelet/pods/146661f6-7cea-4d6c-904b-8252681753cb/volumes" Oct 01 13:04:34 crc kubenswrapper[4727]: I1001 13:04:34.386259 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c100de0-2ce7-4c60-b790-57c91a64f9c5" path="/var/lib/kubelet/pods/6c100de0-2ce7-4c60-b790-57c91a64f9c5/volumes" Oct 01 13:04:34 crc kubenswrapper[4727]: I1001 13:04:34.386992 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbff18ff-5109-45fb-8bbd-36e660aba31e" path="/var/lib/kubelet/pods/dbff18ff-5109-45fb-8bbd-36e660aba31e/volumes" Oct 01 13:04:44 crc kubenswrapper[4727]: I1001 13:04:44.373053 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:04:44 crc kubenswrapper[4727]: E1001 13:04:44.375218 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:04:56 crc kubenswrapper[4727]: I1001 13:04:56.295876 4727 generic.go:334] "Generic (PLEG): container finished" podID="2d0a1f80-62ee-4cc4-9a3b-48e7289508c4" containerID="621a4e508115d45b302dc157d33d61e544b029703d98638f3c33f4a3b8b0d30b" exitCode=0 Oct 01 13:04:56 crc kubenswrapper[4727]: I1001 13:04:56.296008 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" event={"ID":"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4","Type":"ContainerDied","Data":"621a4e508115d45b302dc157d33d61e544b029703d98638f3c33f4a3b8b0d30b"} Oct 01 13:04:57 crc kubenswrapper[4727]: I1001 13:04:57.813053 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:04:57 crc kubenswrapper[4727]: I1001 13:04:57.949376 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-inventory\") pod \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\" (UID: \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\") " Oct 01 13:04:57 crc kubenswrapper[4727]: I1001 13:04:57.949648 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5vr5\" (UniqueName: \"kubernetes.io/projected/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-kube-api-access-n5vr5\") pod \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\" (UID: \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\") " Oct 01 13:04:57 crc kubenswrapper[4727]: I1001 13:04:57.949775 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-ssh-key\") pod \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\" (UID: \"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4\") " Oct 01 13:04:57 crc kubenswrapper[4727]: I1001 13:04:57.957561 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-kube-api-access-n5vr5" (OuterVolumeSpecName: "kube-api-access-n5vr5") pod "2d0a1f80-62ee-4cc4-9a3b-48e7289508c4" (UID: "2d0a1f80-62ee-4cc4-9a3b-48e7289508c4"). InnerVolumeSpecName "kube-api-access-n5vr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:04:57 crc kubenswrapper[4727]: I1001 13:04:57.976865 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-inventory" (OuterVolumeSpecName: "inventory") pod "2d0a1f80-62ee-4cc4-9a3b-48e7289508c4" (UID: "2d0a1f80-62ee-4cc4-9a3b-48e7289508c4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:04:57 crc kubenswrapper[4727]: I1001 13:04:57.977601 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2d0a1f80-62ee-4cc4-9a3b-48e7289508c4" (UID: "2d0a1f80-62ee-4cc4-9a3b-48e7289508c4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.052307 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5vr5\" (UniqueName: \"kubernetes.io/projected/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-kube-api-access-n5vr5\") on node \"crc\" DevicePath \"\"" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.052346 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.052357 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d0a1f80-62ee-4cc4-9a3b-48e7289508c4-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.316086 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" event={"ID":"2d0a1f80-62ee-4cc4-9a3b-48e7289508c4","Type":"ContainerDied","Data":"3779bfce4b1741e643006f2163f28ff4f8752be6ebc3bca8c9bf34fcb10f2e80"} Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.316136 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3779bfce4b1741e643006f2163f28ff4f8752be6ebc3bca8c9bf34fcb10f2e80" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.316176 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.394725 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp"] Oct 01 13:04:58 crc kubenswrapper[4727]: E1001 13:04:58.395268 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d0a1f80-62ee-4cc4-9a3b-48e7289508c4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.395295 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d0a1f80-62ee-4cc4-9a3b-48e7289508c4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.395528 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d0a1f80-62ee-4cc4-9a3b-48e7289508c4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.396408 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.398264 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.398685 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.398784 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.399350 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.405100 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp"] Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.459554 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnf86\" (UniqueName: \"kubernetes.io/projected/8b36bd6d-4297-4c41-a010-4d3b10e169b2-kube-api-access-cnf86\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp\" (UID: \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.459642 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b36bd6d-4297-4c41-a010-4d3b10e169b2-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp\" (UID: \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.459782 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b36bd6d-4297-4c41-a010-4d3b10e169b2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp\" (UID: \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.562108 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b36bd6d-4297-4c41-a010-4d3b10e169b2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp\" (UID: \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.562211 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnf86\" (UniqueName: \"kubernetes.io/projected/8b36bd6d-4297-4c41-a010-4d3b10e169b2-kube-api-access-cnf86\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp\" (UID: \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.562275 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b36bd6d-4297-4c41-a010-4d3b10e169b2-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp\" (UID: \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.568976 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b36bd6d-4297-4c41-a010-4d3b10e169b2-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp\" (UID: \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.570504 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b36bd6d-4297-4c41-a010-4d3b10e169b2-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp\" (UID: \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.579502 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnf86\" (UniqueName: \"kubernetes.io/projected/8b36bd6d-4297-4c41-a010-4d3b10e169b2-kube-api-access-cnf86\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp\" (UID: \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:04:58 crc kubenswrapper[4727]: I1001 13:04:58.724372 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:04:59 crc kubenswrapper[4727]: I1001 13:04:59.242075 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp"] Oct 01 13:04:59 crc kubenswrapper[4727]: I1001 13:04:59.325048 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" event={"ID":"8b36bd6d-4297-4c41-a010-4d3b10e169b2","Type":"ContainerStarted","Data":"30294f9e5103937f784d751be136855c9704eac8fb6896d15e086023802dccbe"} Oct 01 13:04:59 crc kubenswrapper[4727]: I1001 13:04:59.372402 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:04:59 crc kubenswrapper[4727]: E1001 13:04:59.372666 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:05:00 crc kubenswrapper[4727]: I1001 13:05:00.335255 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" event={"ID":"8b36bd6d-4297-4c41-a010-4d3b10e169b2","Type":"ContainerStarted","Data":"5227e41706fe22caca004dd43f8425ad2a85e0debd4e3023254df724a1350c26"} Oct 01 13:05:05 crc kubenswrapper[4727]: I1001 13:05:05.373682 4727 generic.go:334] "Generic (PLEG): container finished" podID="8b36bd6d-4297-4c41-a010-4d3b10e169b2" containerID="5227e41706fe22caca004dd43f8425ad2a85e0debd4e3023254df724a1350c26" exitCode=0 Oct 01 13:05:05 crc 
kubenswrapper[4727]: I1001 13:05:05.373760 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" event={"ID":"8b36bd6d-4297-4c41-a010-4d3b10e169b2","Type":"ContainerDied","Data":"5227e41706fe22caca004dd43f8425ad2a85e0debd4e3023254df724a1350c26"} Oct 01 13:05:06 crc kubenswrapper[4727]: I1001 13:05:06.038049 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4nf6w"] Oct 01 13:05:06 crc kubenswrapper[4727]: I1001 13:05:06.045779 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4nf6w"] Oct 01 13:05:06 crc kubenswrapper[4727]: I1001 13:05:06.387755 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20c7bb1-8c02-4415-9439-1d35d550b644" path="/var/lib/kubelet/pods/b20c7bb1-8c02-4415-9439-1d35d550b644/volumes" Oct 01 13:05:06 crc kubenswrapper[4727]: I1001 13:05:06.799973 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:05:06 crc kubenswrapper[4727]: I1001 13:05:06.910266 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b36bd6d-4297-4c41-a010-4d3b10e169b2-inventory\") pod \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\" (UID: \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\") " Oct 01 13:05:06 crc kubenswrapper[4727]: I1001 13:05:06.910674 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b36bd6d-4297-4c41-a010-4d3b10e169b2-ssh-key\") pod \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\" (UID: \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\") " Oct 01 13:05:06 crc kubenswrapper[4727]: I1001 13:05:06.910838 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnf86\" (UniqueName: \"kubernetes.io/projected/8b36bd6d-4297-4c41-a010-4d3b10e169b2-kube-api-access-cnf86\") pod \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\" (UID: \"8b36bd6d-4297-4c41-a010-4d3b10e169b2\") " Oct 01 13:05:06 crc kubenswrapper[4727]: I1001 13:05:06.916326 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b36bd6d-4297-4c41-a010-4d3b10e169b2-kube-api-access-cnf86" (OuterVolumeSpecName: "kube-api-access-cnf86") pod "8b36bd6d-4297-4c41-a010-4d3b10e169b2" (UID: "8b36bd6d-4297-4c41-a010-4d3b10e169b2"). InnerVolumeSpecName "kube-api-access-cnf86". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:05:06 crc kubenswrapper[4727]: I1001 13:05:06.939679 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b36bd6d-4297-4c41-a010-4d3b10e169b2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8b36bd6d-4297-4c41-a010-4d3b10e169b2" (UID: "8b36bd6d-4297-4c41-a010-4d3b10e169b2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:05:06 crc kubenswrapper[4727]: I1001 13:05:06.941317 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b36bd6d-4297-4c41-a010-4d3b10e169b2-inventory" (OuterVolumeSpecName: "inventory") pod "8b36bd6d-4297-4c41-a010-4d3b10e169b2" (UID: "8b36bd6d-4297-4c41-a010-4d3b10e169b2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.013576 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b36bd6d-4297-4c41-a010-4d3b10e169b2-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.013609 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b36bd6d-4297-4c41-a010-4d3b10e169b2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.013621 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnf86\" (UniqueName: \"kubernetes.io/projected/8b36bd6d-4297-4c41-a010-4d3b10e169b2-kube-api-access-cnf86\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.392251 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" event={"ID":"8b36bd6d-4297-4c41-a010-4d3b10e169b2","Type":"ContainerDied","Data":"30294f9e5103937f784d751be136855c9704eac8fb6896d15e086023802dccbe"} Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.392292 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30294f9e5103937f784d751be136855c9704eac8fb6896d15e086023802dccbe" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.392345 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.470481 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g"] Oct 01 13:05:07 crc kubenswrapper[4727]: E1001 13:05:07.471467 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b36bd6d-4297-4c41-a010-4d3b10e169b2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.471499 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b36bd6d-4297-4c41-a010-4d3b10e169b2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.471907 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b36bd6d-4297-4c41-a010-4d3b10e169b2" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.472901 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.478733 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.478904 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.478952 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.479024 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.479461 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g"] Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.522050 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8hqm\" (UniqueName: \"kubernetes.io/projected/388b066d-a9db-4f3c-a0e1-c03c12aac2df-kube-api-access-t8hqm\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wm46g\" (UID: \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.522163 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/388b066d-a9db-4f3c-a0e1-c03c12aac2df-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wm46g\" (UID: \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.522281 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/388b066d-a9db-4f3c-a0e1-c03c12aac2df-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wm46g\" (UID: \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.624540 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/388b066d-a9db-4f3c-a0e1-c03c12aac2df-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wm46g\" (UID: \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.624697 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/388b066d-a9db-4f3c-a0e1-c03c12aac2df-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wm46g\" (UID: \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.624770 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8hqm\" (UniqueName: \"kubernetes.io/projected/388b066d-a9db-4f3c-a0e1-c03c12aac2df-kube-api-access-t8hqm\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wm46g\" (UID: 
\"388b066d-a9db-4f3c-a0e1-c03c12aac2df\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.630710 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/388b066d-a9db-4f3c-a0e1-c03c12aac2df-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wm46g\" (UID: \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.633302 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/388b066d-a9db-4f3c-a0e1-c03c12aac2df-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wm46g\" (UID: \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.650419 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8hqm\" (UniqueName: \"kubernetes.io/projected/388b066d-a9db-4f3c-a0e1-c03c12aac2df-kube-api-access-t8hqm\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wm46g\" (UID: \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:07 crc kubenswrapper[4727]: I1001 13:05:07.793417 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:08 crc kubenswrapper[4727]: I1001 13:05:08.296584 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g"] Oct 01 13:05:08 crc kubenswrapper[4727]: I1001 13:05:08.407742 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" event={"ID":"388b066d-a9db-4f3c-a0e1-c03c12aac2df","Type":"ContainerStarted","Data":"73779b34488130f95730933a8250b0c37bc599ea8ad2c98c91d12e8227a476ed"} Oct 01 13:05:09 crc kubenswrapper[4727]: I1001 13:05:09.416814 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" event={"ID":"388b066d-a9db-4f3c-a0e1-c03c12aac2df","Type":"ContainerStarted","Data":"aae27b38e76202a54ba8542efae155772d0cbb39cc33ab877300def72dda3e5f"} Oct 01 13:05:09 crc kubenswrapper[4727]: I1001 13:05:09.440236 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" podStartSLOduration=1.926607364 podStartE2EDuration="2.440215207s" podCreationTimestamp="2025-10-01 13:05:07 +0000 UTC" firstStartedPulling="2025-10-01 13:05:08.304125248 +0000 UTC m=+1686.625480085" lastFinishedPulling="2025-10-01 13:05:08.817733091 +0000 UTC m=+1687.139087928" observedRunningTime="2025-10-01 13:05:09.434667581 +0000 UTC m=+1687.756022428" watchObservedRunningTime="2025-10-01 13:05:09.440215207 +0000 UTC m=+1687.761570044" Oct 01 13:05:10 crc kubenswrapper[4727]: I1001 13:05:10.880594 4727 scope.go:117] "RemoveContainer" containerID="ab9ca6366baa24e6fa0c5c395427786e7df924e70ac81bb98e42eeff5dac5d9c" Oct 01 13:05:10 crc kubenswrapper[4727]: I1001 13:05:10.907359 4727 scope.go:117] "RemoveContainer" containerID="4b46339c373e1e26afad82a49874f20909c298d40daa758f5a3425ed8d2f36cd" Oct 01 13:05:10 crc kubenswrapper[4727]: I1001 13:05:10.956783 4727 scope.go:117] 
"RemoveContainer" containerID="d8ae7785c1f1ddeaba7fe147dec8eae40ca763c569b1d9fdc919bcde0ab4aed9" Oct 01 13:05:11 crc kubenswrapper[4727]: I1001 13:05:11.006562 4727 scope.go:117] "RemoveContainer" containerID="b873a6cc63eb4f0f842a93ec4f4daf58ccf4dd4beb19ee861959f46446b44f5d" Oct 01 13:05:11 crc kubenswrapper[4727]: I1001 13:05:11.046199 4727 scope.go:117] "RemoveContainer" containerID="717bf5651651b20988f784b3348033a06b65fa45e9893624d920f7e18d394d4a" Oct 01 13:05:11 crc kubenswrapper[4727]: I1001 13:05:11.105635 4727 scope.go:117] "RemoveContainer" containerID="e09eda7289627cf45bd768c441c1c47cf62ff7b48e9a43d0bd05645c67d25608" Oct 01 13:05:11 crc kubenswrapper[4727]: I1001 13:05:11.126535 4727 scope.go:117] "RemoveContainer" containerID="7290d19e504390303179c1a68cd83f65b5f2b38c68624f601ddb3c777e4639d1" Oct 01 13:05:13 crc kubenswrapper[4727]: I1001 13:05:13.372571 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:05:13 crc kubenswrapper[4727]: E1001 13:05:13.373173 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:05:28 crc kubenswrapper[4727]: I1001 13:05:28.038028 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-wr7vr"] Oct 01 13:05:28 crc kubenswrapper[4727]: I1001 13:05:28.046839 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-wr7vr"] Oct 01 13:05:28 crc kubenswrapper[4727]: I1001 13:05:28.372329 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:05:28 crc kubenswrapper[4727]: E1001 13:05:28.372604 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:05:28 crc kubenswrapper[4727]: I1001 13:05:28.385760 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58437054-5624-488d-a672-2eb046c0d09c" path="/var/lib/kubelet/pods/58437054-5624-488d-a672-2eb046c0d09c/volumes" Oct 01 13:05:29 crc kubenswrapper[4727]: I1001 13:05:29.032553 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kfqzs"] Oct 01 13:05:29 crc kubenswrapper[4727]: I1001 13:05:29.043463 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-kfqzs"] Oct 01 13:05:30 crc kubenswrapper[4727]: I1001 13:05:30.383362 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccd566f7-b3ff-4de8-8ec9-8c080005d70a" path="/var/lib/kubelet/pods/ccd566f7-b3ff-4de8-8ec9-8c080005d70a/volumes" Oct 01 13:05:41 crc kubenswrapper[4727]: I1001 13:05:41.373247 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:05:41 crc kubenswrapper[4727]: E1001 
13:05:41.374140 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:05:44 crc kubenswrapper[4727]: I1001 13:05:44.721781 4727 generic.go:334] "Generic (PLEG): container finished" podID="388b066d-a9db-4f3c-a0e1-c03c12aac2df" containerID="aae27b38e76202a54ba8542efae155772d0cbb39cc33ab877300def72dda3e5f" exitCode=0 Oct 01 13:05:44 crc kubenswrapper[4727]: I1001 13:05:44.721869 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" event={"ID":"388b066d-a9db-4f3c-a0e1-c03c12aac2df","Type":"ContainerDied","Data":"aae27b38e76202a54ba8542efae155772d0cbb39cc33ab877300def72dda3e5f"} Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.197369 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.234093 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/388b066d-a9db-4f3c-a0e1-c03c12aac2df-inventory\") pod \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\" (UID: \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\") " Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.234326 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/388b066d-a9db-4f3c-a0e1-c03c12aac2df-ssh-key\") pod \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\" (UID: \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\") " Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.234604 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8hqm\" (UniqueName: \"kubernetes.io/projected/388b066d-a9db-4f3c-a0e1-c03c12aac2df-kube-api-access-t8hqm\") pod \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\" (UID: \"388b066d-a9db-4f3c-a0e1-c03c12aac2df\") " Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.243375 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388b066d-a9db-4f3c-a0e1-c03c12aac2df-kube-api-access-t8hqm" (OuterVolumeSpecName: "kube-api-access-t8hqm") pod "388b066d-a9db-4f3c-a0e1-c03c12aac2df" (UID: "388b066d-a9db-4f3c-a0e1-c03c12aac2df"). InnerVolumeSpecName "kube-api-access-t8hqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.272228 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388b066d-a9db-4f3c-a0e1-c03c12aac2df-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "388b066d-a9db-4f3c-a0e1-c03c12aac2df" (UID: "388b066d-a9db-4f3c-a0e1-c03c12aac2df"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.273595 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388b066d-a9db-4f3c-a0e1-c03c12aac2df-inventory" (OuterVolumeSpecName: "inventory") pod "388b066d-a9db-4f3c-a0e1-c03c12aac2df" (UID: "388b066d-a9db-4f3c-a0e1-c03c12aac2df"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.337012 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8hqm\" (UniqueName: \"kubernetes.io/projected/388b066d-a9db-4f3c-a0e1-c03c12aac2df-kube-api-access-t8hqm\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.337082 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/388b066d-a9db-4f3c-a0e1-c03c12aac2df-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.337093 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/388b066d-a9db-4f3c-a0e1-c03c12aac2df-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.749146 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" event={"ID":"388b066d-a9db-4f3c-a0e1-c03c12aac2df","Type":"ContainerDied","Data":"73779b34488130f95730933a8250b0c37bc599ea8ad2c98c91d12e8227a476ed"} Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.749190 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73779b34488130f95730933a8250b0c37bc599ea8ad2c98c91d12e8227a476ed" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.749249 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wm46g" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.841338 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz"] Oct 01 13:05:46 crc kubenswrapper[4727]: E1001 13:05:46.841826 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388b066d-a9db-4f3c-a0e1-c03c12aac2df" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.841852 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="388b066d-a9db-4f3c-a0e1-c03c12aac2df" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.842142 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="388b066d-a9db-4f3c-a0e1-c03c12aac2df" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.842961 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.845889 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.846160 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.846494 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.847413 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.893870 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz"] Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.947280 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a70d26d-18ff-4550-b1eb-a720a38162ee-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz\" (UID: \"5a70d26d-18ff-4550-b1eb-a720a38162ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.947369 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a70d26d-18ff-4550-b1eb-a720a38162ee-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz\" (UID: \"5a70d26d-18ff-4550-b1eb-a720a38162ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:05:46 crc kubenswrapper[4727]: I1001 13:05:46.947405 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnvmk\" (UniqueName: \"kubernetes.io/projected/5a70d26d-18ff-4550-b1eb-a720a38162ee-kube-api-access-pnvmk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz\" (UID: \"5a70d26d-18ff-4550-b1eb-a720a38162ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:05:47 crc kubenswrapper[4727]: I1001 13:05:47.049276 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a70d26d-18ff-4550-b1eb-a720a38162ee-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz\" (UID: \"5a70d26d-18ff-4550-b1eb-a720a38162ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:05:47 crc kubenswrapper[4727]: I1001 13:05:47.049641 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnvmk\" (UniqueName: \"kubernetes.io/projected/5a70d26d-18ff-4550-b1eb-a720a38162ee-kube-api-access-pnvmk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz\" (UID: \"5a70d26d-18ff-4550-b1eb-a720a38162ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:05:47 crc kubenswrapper[4727]: I1001 13:05:47.049810 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a70d26d-18ff-4550-b1eb-a720a38162ee-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz\" 
(UID: \"5a70d26d-18ff-4550-b1eb-a720a38162ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:05:47 crc kubenswrapper[4727]: I1001 13:05:47.054451 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a70d26d-18ff-4550-b1eb-a720a38162ee-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz\" (UID: \"5a70d26d-18ff-4550-b1eb-a720a38162ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:05:47 crc kubenswrapper[4727]: I1001 13:05:47.067689 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a70d26d-18ff-4550-b1eb-a720a38162ee-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz\" (UID: \"5a70d26d-18ff-4550-b1eb-a720a38162ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:05:47 crc kubenswrapper[4727]: I1001 13:05:47.068653 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnvmk\" (UniqueName: \"kubernetes.io/projected/5a70d26d-18ff-4550-b1eb-a720a38162ee-kube-api-access-pnvmk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz\" (UID: \"5a70d26d-18ff-4550-b1eb-a720a38162ee\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:05:47 crc kubenswrapper[4727]: I1001 13:05:47.165126 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:05:47 crc kubenswrapper[4727]: I1001 13:05:47.690577 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz"] Oct 01 13:05:47 crc kubenswrapper[4727]: I1001 13:05:47.761047 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" event={"ID":"5a70d26d-18ff-4550-b1eb-a720a38162ee","Type":"ContainerStarted","Data":"51eb513fa119df9fc2a8a1f0932d97b8b51cecdfed5da36a2c845b9f7f1936cc"} Oct 01 13:05:48 crc kubenswrapper[4727]: I1001 13:05:48.770706 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" event={"ID":"5a70d26d-18ff-4550-b1eb-a720a38162ee","Type":"ContainerStarted","Data":"0c9e5f07098daba64b9822d7785384b412e0d3b3fd28289759c4e6ed327f0bc7"} Oct 01 13:05:48 crc kubenswrapper[4727]: I1001 13:05:48.794160 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" podStartSLOduration=2.233631535 podStartE2EDuration="2.79412986s" podCreationTimestamp="2025-10-01 13:05:46 +0000 UTC" firstStartedPulling="2025-10-01 13:05:47.697326793 +0000 UTC m=+1726.018681630" lastFinishedPulling="2025-10-01 13:05:48.257825128 +0000 UTC m=+1726.579179955" observedRunningTime="2025-10-01 13:05:48.785654392 +0000 UTC m=+1727.107009239" watchObservedRunningTime="2025-10-01 13:05:48.79412986 +0000 UTC m=+1727.115484707" Oct 01 13:05:53 crc kubenswrapper[4727]: I1001 13:05:53.372287 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:05:53 crc kubenswrapper[4727]: E1001 13:05:53.373240 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:06:04 crc kubenswrapper[4727]: I1001 13:06:04.372842 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:06:04 crc kubenswrapper[4727]: E1001 13:06:04.373681 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:06:11 crc kubenswrapper[4727]: I1001 13:06:11.293720 4727 scope.go:117] "RemoveContainer" containerID="7507da85d2ce54f81bb9b454e600e5cb7a114181bc5d055bc2b141684875eb8c" Oct 01 13:06:11 crc kubenswrapper[4727]: I1001 13:06:11.339779 4727 scope.go:117] "RemoveContainer" containerID="76e3eea5716d1ca4c8b469c7d39bfa91104b0e74884abd01413900c8f4e64805" Oct 01 13:06:14 crc kubenswrapper[4727]: I1001 13:06:14.084896 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-rpkr4"] Oct 01 13:06:14 crc kubenswrapper[4727]: I1001 13:06:14.097073 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-rpkr4"] Oct 01 13:06:14 crc kubenswrapper[4727]: I1001 13:06:14.386529 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc24fafb-1942-435d-8dd4-412ef1b4ebd6" path="/var/lib/kubelet/pods/cc24fafb-1942-435d-8dd4-412ef1b4ebd6/volumes" Oct 01 13:06:16 crc kubenswrapper[4727]: I1001 13:06:16.372370 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:06:16 crc kubenswrapper[4727]: E1001 13:06:16.372945 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:06:30 crc kubenswrapper[4727]: I1001 13:06:30.372065 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:06:30 crc kubenswrapper[4727]: E1001 13:06:30.372843 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:06:43 crc kubenswrapper[4727]: I1001 13:06:43.248250 4727 generic.go:334] "Generic (PLEG): container finished" podID="5a70d26d-18ff-4550-b1eb-a720a38162ee" containerID="0c9e5f07098daba64b9822d7785384b412e0d3b3fd28289759c4e6ed327f0bc7" exitCode=2 Oct 01 13:06:43 crc kubenswrapper[4727]: I1001 13:06:43.248361 4727 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" event={"ID":"5a70d26d-18ff-4550-b1eb-a720a38162ee","Type":"ContainerDied","Data":"0c9e5f07098daba64b9822d7785384b412e0d3b3fd28289759c4e6ed327f0bc7"} Oct 01 13:06:44 crc kubenswrapper[4727]: I1001 13:06:44.640701 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:06:44 crc kubenswrapper[4727]: I1001 13:06:44.788906 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a70d26d-18ff-4550-b1eb-a720a38162ee-inventory\") pod \"5a70d26d-18ff-4550-b1eb-a720a38162ee\" (UID: \"5a70d26d-18ff-4550-b1eb-a720a38162ee\") " Oct 01 13:06:44 crc kubenswrapper[4727]: I1001 13:06:44.789081 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnvmk\" (UniqueName: \"kubernetes.io/projected/5a70d26d-18ff-4550-b1eb-a720a38162ee-kube-api-access-pnvmk\") pod \"5a70d26d-18ff-4550-b1eb-a720a38162ee\" (UID: \"5a70d26d-18ff-4550-b1eb-a720a38162ee\") " Oct 01 13:06:44 crc kubenswrapper[4727]: I1001 13:06:44.789258 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a70d26d-18ff-4550-b1eb-a720a38162ee-ssh-key\") pod \"5a70d26d-18ff-4550-b1eb-a720a38162ee\" (UID: \"5a70d26d-18ff-4550-b1eb-a720a38162ee\") " Oct 01 13:06:44 crc kubenswrapper[4727]: I1001 13:06:44.796417 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a70d26d-18ff-4550-b1eb-a720a38162ee-kube-api-access-pnvmk" (OuterVolumeSpecName: "kube-api-access-pnvmk") pod "5a70d26d-18ff-4550-b1eb-a720a38162ee" (UID: "5a70d26d-18ff-4550-b1eb-a720a38162ee"). InnerVolumeSpecName "kube-api-access-pnvmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:44 crc kubenswrapper[4727]: I1001 13:06:44.820730 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a70d26d-18ff-4550-b1eb-a720a38162ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5a70d26d-18ff-4550-b1eb-a720a38162ee" (UID: "5a70d26d-18ff-4550-b1eb-a720a38162ee"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:44 crc kubenswrapper[4727]: I1001 13:06:44.826301 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a70d26d-18ff-4550-b1eb-a720a38162ee-inventory" (OuterVolumeSpecName: "inventory") pod "5a70d26d-18ff-4550-b1eb-a720a38162ee" (UID: "5a70d26d-18ff-4550-b1eb-a720a38162ee"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:44 crc kubenswrapper[4727]: I1001 13:06:44.892827 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a70d26d-18ff-4550-b1eb-a720a38162ee-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:44 crc kubenswrapper[4727]: I1001 13:06:44.892868 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnvmk\" (UniqueName: \"kubernetes.io/projected/5a70d26d-18ff-4550-b1eb-a720a38162ee-kube-api-access-pnvmk\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:44 crc kubenswrapper[4727]: I1001 13:06:44.892879 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5a70d26d-18ff-4550-b1eb-a720a38162ee-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:45 crc kubenswrapper[4727]: I1001 13:06:45.264770 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" event={"ID":"5a70d26d-18ff-4550-b1eb-a720a38162ee","Type":"ContainerDied","Data":"51eb513fa119df9fc2a8a1f0932d97b8b51cecdfed5da36a2c845b9f7f1936cc"} Oct 01 13:06:45 crc kubenswrapper[4727]: I1001 13:06:45.264812 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51eb513fa119df9fc2a8a1f0932d97b8b51cecdfed5da36a2c845b9f7f1936cc" Oct 01 13:06:45 crc kubenswrapper[4727]: I1001 13:06:45.264871 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz" Oct 01 13:06:45 crc kubenswrapper[4727]: I1001 13:06:45.372387 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:06:45 crc kubenswrapper[4727]: E1001 13:06:45.372846 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.071704 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j"] Oct 01 13:06:52 crc kubenswrapper[4727]: E1001 13:06:52.072825 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a70d26d-18ff-4550-b1eb-a720a38162ee" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.072845 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a70d26d-18ff-4550-b1eb-a720a38162ee" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.073131 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a70d26d-18ff-4550-b1eb-a720a38162ee" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.074171 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.078140 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.078232 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.078358 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.078172 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.081621 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j"] Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.142942 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skkf6\" (UniqueName: \"kubernetes.io/projected/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-kube-api-access-skkf6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j\" (UID: \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.143192 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j\" (UID: \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.143224 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j\" (UID: \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.245338 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j\" (UID: \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.245390 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j\" (UID: \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.245513 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skkf6\" (UniqueName: \"kubernetes.io/projected/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-kube-api-access-skkf6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j\" 
(UID: \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.251600 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j\" (UID: \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.255251 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j\" (UID: \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.262300 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skkf6\" (UniqueName: \"kubernetes.io/projected/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-kube-api-access-skkf6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j\" (UID: \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.400837 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:06:52 crc kubenswrapper[4727]: I1001 13:06:52.902487 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j"] Oct 01 13:06:53 crc kubenswrapper[4727]: I1001 13:06:53.333446 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" event={"ID":"0e5dff42-2d16-4c83-adc0-bdad8d122cc3","Type":"ContainerStarted","Data":"8866e9cb20236c7d12ac8558124ba2326dcf13f23edbd572736fcbf8ef73489c"} Oct 01 13:06:54 crc kubenswrapper[4727]: I1001 13:06:54.344559 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" event={"ID":"0e5dff42-2d16-4c83-adc0-bdad8d122cc3","Type":"ContainerStarted","Data":"d13d079bfbdef8c50981d6f8e7be3e54a01097f5101a3284fdea2cc7e908ee11"} Oct 01 13:06:59 crc kubenswrapper[4727]: I1001 13:06:59.373216 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:06:59 crc kubenswrapper[4727]: E1001 13:06:59.377095 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:07:10 crc kubenswrapper[4727]: I1001 13:07:10.372348 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:07:10 crc kubenswrapper[4727]: E1001 13:07:10.373083 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:07:11 crc kubenswrapper[4727]: I1001 13:07:11.478648 4727 scope.go:117] "RemoveContainer" containerID="90908d5469a3816efabd59e45d236920f8dcdb8ba2084930667fcc11ac99b0b7" Oct 01 13:07:24 crc kubenswrapper[4727]: I1001 13:07:24.372301 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:07:24 crc kubenswrapper[4727]: E1001 13:07:24.390863 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:07:38 crc kubenswrapper[4727]: I1001 13:07:38.372921 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:07:38 crc kubenswrapper[4727]: E1001 13:07:38.373670 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:07:39 crc kubenswrapper[4727]: I1001 13:07:39.806143 4727 generic.go:334] "Generic (PLEG): container finished" podID="0e5dff42-2d16-4c83-adc0-bdad8d122cc3" containerID="d13d079bfbdef8c50981d6f8e7be3e54a01097f5101a3284fdea2cc7e908ee11" exitCode=0 Oct 01 13:07:39 crc kubenswrapper[4727]: I1001 13:07:39.806251 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" event={"ID":"0e5dff42-2d16-4c83-adc0-bdad8d122cc3","Type":"ContainerDied","Data":"d13d079bfbdef8c50981d6f8e7be3e54a01097f5101a3284fdea2cc7e908ee11"} Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.219504 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.309835 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skkf6\" (UniqueName: \"kubernetes.io/projected/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-kube-api-access-skkf6\") pod \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\" (UID: \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\") " Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.310093 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-inventory\") pod \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\" (UID: \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\") " Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.310150 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-ssh-key\") pod \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\" (UID: \"0e5dff42-2d16-4c83-adc0-bdad8d122cc3\") " Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.318922 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-kube-api-access-skkf6" (OuterVolumeSpecName: "kube-api-access-skkf6") pod "0e5dff42-2d16-4c83-adc0-bdad8d122cc3" (UID: "0e5dff42-2d16-4c83-adc0-bdad8d122cc3"). InnerVolumeSpecName "kube-api-access-skkf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.340355 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0e5dff42-2d16-4c83-adc0-bdad8d122cc3" (UID: "0e5dff42-2d16-4c83-adc0-bdad8d122cc3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.361478 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-inventory" (OuterVolumeSpecName: "inventory") pod "0e5dff42-2d16-4c83-adc0-bdad8d122cc3" (UID: "0e5dff42-2d16-4c83-adc0-bdad8d122cc3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.412828 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skkf6\" (UniqueName: \"kubernetes.io/projected/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-kube-api-access-skkf6\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.412864 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.412873 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e5dff42-2d16-4c83-adc0-bdad8d122cc3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.825683 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" event={"ID":"0e5dff42-2d16-4c83-adc0-bdad8d122cc3","Type":"ContainerDied","Data":"8866e9cb20236c7d12ac8558124ba2326dcf13f23edbd572736fcbf8ef73489c"} Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.825732 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8866e9cb20236c7d12ac8558124ba2326dcf13f23edbd572736fcbf8ef73489c" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.825764 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.915090 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ksfcf"] Oct 01 13:07:41 crc kubenswrapper[4727]: E1001 13:07:41.915898 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e5dff42-2d16-4c83-adc0-bdad8d122cc3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.915919 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5dff42-2d16-4c83-adc0-bdad8d122cc3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.916706 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e5dff42-2d16-4c83-adc0-bdad8d122cc3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.919482 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.924711 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.925068 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.925127 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.925165 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:07:41 crc kubenswrapper[4727]: I1001 13:07:41.930335 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ksfcf"] Oct 01 13:07:42 crc kubenswrapper[4727]: I1001 13:07:42.025519 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ksfcf\" (UID: \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\") " pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:42 crc kubenswrapper[4727]: I1001 13:07:42.025593 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkz2r\" (UniqueName: \"kubernetes.io/projected/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-kube-api-access-jkz2r\") pod \"ssh-known-hosts-edpm-deployment-ksfcf\" (UID: \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\") " pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:42 crc kubenswrapper[4727]: I1001 13:07:42.025816 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ksfcf\" (UID: \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\") " pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:42 crc kubenswrapper[4727]: I1001 13:07:42.127318 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ksfcf\" (UID: \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\") " pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:42 crc kubenswrapper[4727]: I1001 13:07:42.127690 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkz2r\" (UniqueName: \"kubernetes.io/projected/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-kube-api-access-jkz2r\") pod \"ssh-known-hosts-edpm-deployment-ksfcf\" (UID: \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\") " pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:42 crc kubenswrapper[4727]: I1001 13:07:42.127844 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ksfcf\" (UID: \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\") " pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:42 crc 
kubenswrapper[4727]: I1001 13:07:42.131072 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ksfcf\" (UID: \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\") " pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:42 crc kubenswrapper[4727]: I1001 13:07:42.134912 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ksfcf\" (UID: \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\") " pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:42 crc kubenswrapper[4727]: I1001 13:07:42.143413 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkz2r\" (UniqueName: \"kubernetes.io/projected/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-kube-api-access-jkz2r\") pod \"ssh-known-hosts-edpm-deployment-ksfcf\" (UID: \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\") " pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:42 crc kubenswrapper[4727]: I1001 13:07:42.246372 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:42 crc kubenswrapper[4727]: I1001 13:07:42.735505 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ksfcf"] Oct 01 13:07:42 crc kubenswrapper[4727]: I1001 13:07:42.742763 4727 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:07:42 crc kubenswrapper[4727]: I1001 13:07:42.834203 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" event={"ID":"6ee1dacb-0b88-4de8-aa88-a24d1494ed94","Type":"ContainerStarted","Data":"68bbb956f1d4b438287aa844f9edbced214efc8dc556352a1cdb63d95ad53def"} Oct 01 13:07:43 crc kubenswrapper[4727]: I1001 13:07:43.845677 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" event={"ID":"6ee1dacb-0b88-4de8-aa88-a24d1494ed94","Type":"ContainerStarted","Data":"5b43cd35453fbb99c079f39812fc0b2cef0dc73de70b547b36e40d01279153ea"} Oct 01 13:07:43 crc kubenswrapper[4727]: I1001 13:07:43.862698 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" podStartSLOduration=2.405646182 podStartE2EDuration="2.862680938s" podCreationTimestamp="2025-10-01 13:07:41 +0000 UTC" firstStartedPulling="2025-10-01 13:07:42.742484621 +0000 UTC m=+1841.063839458" lastFinishedPulling="2025-10-01 13:07:43.199519377 +0000 UTC m=+1841.520874214" observedRunningTime="2025-10-01 13:07:43.860621102 +0000 UTC m=+1842.181975939" watchObservedRunningTime="2025-10-01 13:07:43.862680938 +0000 UTC m=+1842.184035775" Oct 01 13:07:49 crc kubenswrapper[4727]: I1001 13:07:49.373153 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:07:49 crc kubenswrapper[4727]: E1001 13:07:49.374471 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:07:50 crc kubenswrapper[4727]: I1001 13:07:50.904766 4727 generic.go:334] "Generic (PLEG): container finished" podID="6ee1dacb-0b88-4de8-aa88-a24d1494ed94" containerID="5b43cd35453fbb99c079f39812fc0b2cef0dc73de70b547b36e40d01279153ea" exitCode=0 Oct 01 13:07:50 crc kubenswrapper[4727]: I1001 13:07:50.905158 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" event={"ID":"6ee1dacb-0b88-4de8-aa88-a24d1494ed94","Type":"ContainerDied","Data":"5b43cd35453fbb99c079f39812fc0b2cef0dc73de70b547b36e40d01279153ea"} Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.328560 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.424031 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-inventory-0\") pod \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\" (UID: \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\") " Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.424080 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-ssh-key-openstack-edpm-ipam\") pod \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\" (UID: \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\") " Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.424147 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkz2r\" (UniqueName: \"kubernetes.io/projected/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-kube-api-access-jkz2r\") pod \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\" (UID: \"6ee1dacb-0b88-4de8-aa88-a24d1494ed94\") " Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.437242 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-kube-api-access-jkz2r" (OuterVolumeSpecName: "kube-api-access-jkz2r") pod "6ee1dacb-0b88-4de8-aa88-a24d1494ed94" (UID: "6ee1dacb-0b88-4de8-aa88-a24d1494ed94"). InnerVolumeSpecName "kube-api-access-jkz2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.452231 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6ee1dacb-0b88-4de8-aa88-a24d1494ed94" (UID: "6ee1dacb-0b88-4de8-aa88-a24d1494ed94"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.457878 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "6ee1dacb-0b88-4de8-aa88-a24d1494ed94" (UID: "6ee1dacb-0b88-4de8-aa88-a24d1494ed94"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.526400 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkz2r\" (UniqueName: \"kubernetes.io/projected/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-kube-api-access-jkz2r\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.526451 4727 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.526464 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6ee1dacb-0b88-4de8-aa88-a24d1494ed94-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.924535 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" event={"ID":"6ee1dacb-0b88-4de8-aa88-a24d1494ed94","Type":"ContainerDied","Data":"68bbb956f1d4b438287aa844f9edbced214efc8dc556352a1cdb63d95ad53def"} Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.924590 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68bbb956f1d4b438287aa844f9edbced214efc8dc556352a1cdb63d95ad53def" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.924602 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ksfcf" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.995022 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq"] Oct 01 13:07:52 crc kubenswrapper[4727]: E1001 13:07:52.995422 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee1dacb-0b88-4de8-aa88-a24d1494ed94" containerName="ssh-known-hosts-edpm-deployment" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.995445 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee1dacb-0b88-4de8-aa88-a24d1494ed94" containerName="ssh-known-hosts-edpm-deployment" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.995640 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee1dacb-0b88-4de8-aa88-a24d1494ed94" containerName="ssh-known-hosts-edpm-deployment" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.996257 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.999012 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.999288 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:07:52 crc kubenswrapper[4727]: I1001 13:07:52.999438 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.001689 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.006112 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq"] Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.136239 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xx7dq\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.136365 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xx7dq\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.136432 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnhpf\" (UniqueName: \"kubernetes.io/projected/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-kube-api-access-lnhpf\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xx7dq\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.238334 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xx7dq\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.238564 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xx7dq\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.238643 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnhpf\" (UniqueName: \"kubernetes.io/projected/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-kube-api-access-lnhpf\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xx7dq\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.244214 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xx7dq\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.244277 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xx7dq\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.256149 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnhpf\" (UniqueName: \"kubernetes.io/projected/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-kube-api-access-lnhpf\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xx7dq\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.315508 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.669922 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq"] Oct 01 13:07:53 crc kubenswrapper[4727]: I1001 13:07:53.933830 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" event={"ID":"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a","Type":"ContainerStarted","Data":"c44b61d9a977522a06c08e3bd28bb57fc66e33494a03229411dc4c4c4aa65660"} Oct 01 13:07:54 crc kubenswrapper[4727]: I1001 13:07:54.946298 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" event={"ID":"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a","Type":"ContainerStarted","Data":"935a7d694a791def2a90fc39219ecf2aeed02169c1168688a3e1d4690e163739"} Oct 01 13:07:54 crc kubenswrapper[4727]: I1001 13:07:54.966544 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" podStartSLOduration=2.252704326 podStartE2EDuration="2.966521157s" podCreationTimestamp="2025-10-01 13:07:52 +0000 UTC" firstStartedPulling="2025-10-01 13:07:53.677438013 +0000 UTC m=+1851.998792850" lastFinishedPulling="2025-10-01 13:07:54.391254844 +0000 UTC m=+1852.712609681" observedRunningTime="2025-10-01 13:07:54.959728672 +0000 UTC m=+1853.281083529" watchObservedRunningTime="2025-10-01 13:07:54.966521157 +0000 UTC m=+1853.287876014" Oct 01 13:08:03 crc kubenswrapper[4727]: I1001 13:08:03.372904 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:08:03 crc kubenswrapper[4727]: E1001 13:08:03.373688 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:08:04 crc kubenswrapper[4727]: I1001 13:08:04.021787 4727 generic.go:334] "Generic (PLEG): container finished" podID="0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a" containerID="935a7d694a791def2a90fc39219ecf2aeed02169c1168688a3e1d4690e163739" exitCode=0 Oct 01 13:08:04 crc kubenswrapper[4727]: I1001 13:08:04.021837 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" event={"ID":"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a","Type":"ContainerDied","Data":"935a7d694a791def2a90fc39219ecf2aeed02169c1168688a3e1d4690e163739"} Oct 01 13:08:05 crc kubenswrapper[4727]: I1001 13:08:05.482519 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:08:05 crc kubenswrapper[4727]: I1001 13:08:05.596384 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnhpf\" (UniqueName: \"kubernetes.io/projected/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-kube-api-access-lnhpf\") pod \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " Oct 01 13:08:05 crc kubenswrapper[4727]: I1001 13:08:05.596479 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-inventory\") pod \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " Oct 01 13:08:05 crc kubenswrapper[4727]: I1001 13:08:05.596577 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-ssh-key\") pod \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " Oct 01 13:08:05 crc kubenswrapper[4727]: I1001 13:08:05.602459 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-kube-api-access-lnhpf" (OuterVolumeSpecName: "kube-api-access-lnhpf") pod "0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a" (UID: "0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a"). InnerVolumeSpecName "kube-api-access-lnhpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:08:05 crc kubenswrapper[4727]: E1001 13:08:05.622265 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-inventory podName:0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a nodeName:}" failed. No retries permitted until 2025-10-01 13:08:06.122237524 +0000 UTC m=+1864.443592361 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-inventory") pod "0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a" (UID: "0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a") : error deleting /var/lib/kubelet/pods/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a/volume-subpaths: remove /var/lib/kubelet/pods/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a/volume-subpaths: no such file or directory Oct 01 13:08:05 crc kubenswrapper[4727]: I1001 13:08:05.626622 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a" (UID: "0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:08:05 crc kubenswrapper[4727]: I1001 13:08:05.698227 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnhpf\" (UniqueName: \"kubernetes.io/projected/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-kube-api-access-lnhpf\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:05 crc kubenswrapper[4727]: I1001 13:08:05.698261 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.041433 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" event={"ID":"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a","Type":"ContainerDied","Data":"c44b61d9a977522a06c08e3bd28bb57fc66e33494a03229411dc4c4c4aa65660"} Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.041479 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c44b61d9a977522a06c08e3bd28bb57fc66e33494a03229411dc4c4c4aa65660" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.041452 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xx7dq" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.117570 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4"] Oct 01 13:08:06 crc kubenswrapper[4727]: E1001 13:08:06.117919 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.117935 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.118130 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.118678 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.130564 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4"] Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.207016 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-inventory\") pod \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\" (UID: \"0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a\") " Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.207561 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9vgz\" (UniqueName: \"kubernetes.io/projected/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-kube-api-access-d9vgz\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4\" (UID: \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.207616 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4\" (UID: \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.207886 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4\" (UID: \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.216255 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-inventory" (OuterVolumeSpecName: "inventory") pod "0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a" (UID: "0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.309342 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9vgz\" (UniqueName: \"kubernetes.io/projected/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-kube-api-access-d9vgz\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4\" (UID: \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.309452 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4\" (UID: \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.309528 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4\" (UID: \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.309598 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.314567 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4\" (UID: \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.323548 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4\" (UID: \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.327730 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9vgz\" (UniqueName: \"kubernetes.io/projected/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-kube-api-access-d9vgz\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4\" (UID: \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.467148 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:06 crc kubenswrapper[4727]: I1001 13:08:06.994205 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4"] Oct 01 13:08:07 crc kubenswrapper[4727]: I1001 13:08:07.052815 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" event={"ID":"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c","Type":"ContainerStarted","Data":"9c185d87a630c9dded56eb26875aa029cc9b6cde11397d1adce544e49d67f72e"} Oct 01 13:08:08 crc kubenswrapper[4727]: I1001 13:08:08.061783 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" event={"ID":"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c","Type":"ContainerStarted","Data":"49d4c380b353232c4acdcf48305326767210ceb7e516a79fbd049ae9a92e9495"} Oct 01 13:08:08 crc kubenswrapper[4727]: I1001 13:08:08.081813 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" podStartSLOduration=1.6029196959999998 podStartE2EDuration="2.081790552s" podCreationTimestamp="2025-10-01 13:08:06 +0000 UTC" firstStartedPulling="2025-10-01 13:08:07.016180132 +0000 UTC m=+1865.337534969" lastFinishedPulling="2025-10-01 13:08:07.495050988 +0000 UTC m=+1865.816405825" observedRunningTime="2025-10-01 13:08:08.075265336 +0000 UTC m=+1866.396620173" watchObservedRunningTime="2025-10-01 13:08:08.081790552 +0000 UTC m=+1866.403145389" Oct 01 13:08:14 crc kubenswrapper[4727]: I1001 13:08:14.372237 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:08:15 crc kubenswrapper[4727]: I1001 13:08:15.125602 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"ffe3e19b0829e296a5017db986302c2eb85d3b8446c095789e9f37c908e4271f"} Oct 01 13:08:18 crc kubenswrapper[4727]: I1001 13:08:18.150678 4727 generic.go:334] "Generic (PLEG): container finished" podID="0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c" containerID="49d4c380b353232c4acdcf48305326767210ceb7e516a79fbd049ae9a92e9495" exitCode=0 Oct 01 13:08:18 crc kubenswrapper[4727]: I1001 13:08:18.150768 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" event={"ID":"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c","Type":"ContainerDied","Data":"49d4c380b353232c4acdcf48305326767210ceb7e516a79fbd049ae9a92e9495"} Oct 01 13:08:19 crc kubenswrapper[4727]: I1001 13:08:19.570905 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:19 crc kubenswrapper[4727]: I1001 13:08:19.671369 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9vgz\" (UniqueName: \"kubernetes.io/projected/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-kube-api-access-d9vgz\") pod \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\" (UID: \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\") " Oct 01 13:08:19 crc kubenswrapper[4727]: I1001 13:08:19.671527 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-inventory\") pod \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\" (UID: \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\") " Oct 01 13:08:19 crc kubenswrapper[4727]: I1001 13:08:19.671736 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-ssh-key\") pod \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\" (UID: \"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c\") " Oct 01 13:08:19 crc kubenswrapper[4727]: I1001 13:08:19.678217 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-kube-api-access-d9vgz" (OuterVolumeSpecName: "kube-api-access-d9vgz") pod "0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c" (UID: "0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c"). InnerVolumeSpecName "kube-api-access-d9vgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:08:19 crc kubenswrapper[4727]: I1001 13:08:19.703796 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-inventory" (OuterVolumeSpecName: "inventory") pod "0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c" (UID: "0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:08:19 crc kubenswrapper[4727]: I1001 13:08:19.705925 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c" (UID: "0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:08:19 crc kubenswrapper[4727]: I1001 13:08:19.774953 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:19 crc kubenswrapper[4727]: I1001 13:08:19.775012 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:19 crc kubenswrapper[4727]: I1001 13:08:19.775028 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9vgz\" (UniqueName: \"kubernetes.io/projected/0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c-kube-api-access-d9vgz\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.172283 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" event={"ID":"0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c","Type":"ContainerDied","Data":"9c185d87a630c9dded56eb26875aa029cc9b6cde11397d1adce544e49d67f72e"} Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.172601 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c185d87a630c9dded56eb26875aa029cc9b6cde11397d1adce544e49d67f72e" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.172658 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.281548 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr"] Oct 01 13:08:20 crc kubenswrapper[4727]: E1001 13:08:20.282089 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.282112 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.282342 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.283072 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.286647 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.287216 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.287511 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.287677 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.287848 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.288036 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.288208 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.288824 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.296758 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr"] Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.386869 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.386948 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.387051 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.387089 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.387117 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.387158 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.387292 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.387324 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.387466 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.387537 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.387571 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.387703 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.387774 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.387815 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff8cp\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-kube-api-access-ff8cp\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489421 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489478 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489527 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489614 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489641 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489679 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489714 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489763 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489794 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489915 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489947 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489963 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff8cp\" (UniqueName: 
\"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-kube-api-access-ff8cp\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.489985 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.490033 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.495452 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.496494 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.496586 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.497711 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.497986 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.498631 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.498740 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.499257 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.499700 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.499931 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.500567 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.501851 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.507870 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.510795 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff8cp\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-kube-api-access-ff8cp\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.606383 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:08:20 crc kubenswrapper[4727]: I1001 13:08:20.982540 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr"] Oct 01 13:08:20 crc kubenswrapper[4727]: W1001 13:08:20.985841 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44e3fbd7_8cb7_462a_9990_f3e82b978c55.slice/crio-ce78a47da2f9582d1daeaefaee92c668af118e4e41f160b4996cfe0802914cc3 WatchSource:0}: Error finding container ce78a47da2f9582d1daeaefaee92c668af118e4e41f160b4996cfe0802914cc3: Status 404 returned error can't find the container with id ce78a47da2f9582d1daeaefaee92c668af118e4e41f160b4996cfe0802914cc3 Oct 01 13:08:21 crc kubenswrapper[4727]: I1001 13:08:21.186289 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" event={"ID":"44e3fbd7-8cb7-462a-9990-f3e82b978c55","Type":"ContainerStarted","Data":"ce78a47da2f9582d1daeaefaee92c668af118e4e41f160b4996cfe0802914cc3"} Oct 01 13:08:22 crc kubenswrapper[4727]: I1001 13:08:22.195132 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" event={"ID":"44e3fbd7-8cb7-462a-9990-f3e82b978c55","Type":"ContainerStarted","Data":"5b4073d65e1907c962c52146c61cf23c8887df71f6839052c0d9ab0b9bf8d207"} Oct 01 13:08:22 crc kubenswrapper[4727]: I1001 13:08:22.225504 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" podStartSLOduration=1.527326358 podStartE2EDuration="2.225485183s" podCreationTimestamp="2025-10-01 13:08:20 +0000 UTC" firstStartedPulling="2025-10-01 13:08:20.990458494 +0000 UTC m=+1879.311813331" lastFinishedPulling="2025-10-01 13:08:21.688617319 +0000 UTC m=+1880.009972156" observedRunningTime="2025-10-01 13:08:22.217234455 +0000 UTC m=+1880.538589302" watchObservedRunningTime="2025-10-01 13:08:22.225485183 +0000 UTC m=+1880.546840040" Oct 01 13:09:02 crc kubenswrapper[4727]: I1001 13:09:02.553953 4727 generic.go:334] "Generic (PLEG): container finished" podID="44e3fbd7-8cb7-462a-9990-f3e82b978c55" containerID="5b4073d65e1907c962c52146c61cf23c8887df71f6839052c0d9ab0b9bf8d207" exitCode=0 Oct 01 13:09:02 crc kubenswrapper[4727]: I1001 13:09:02.554046 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" 
event={"ID":"44e3fbd7-8cb7-462a-9990-f3e82b978c55","Type":"ContainerDied","Data":"5b4073d65e1907c962c52146c61cf23c8887df71f6839052c0d9ab0b9bf8d207"} Oct 01 13:09:03 crc kubenswrapper[4727]: I1001 13:09:03.989920 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054108 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-inventory\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054177 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-ssh-key\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054199 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-ovn-default-certs-0\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054243 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-neutron-metadata-combined-ca-bundle\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054282 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff8cp\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-kube-api-access-ff8cp\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054314 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-nova-combined-ca-bundle\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054375 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054394 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-ovn-combined-ca-bundle\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054408 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-telemetry-combined-ca-bundle\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054460 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-repo-setup-combined-ca-bundle\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054542 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-bootstrap-combined-ca-bundle\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054615 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054652 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.054700 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-libvirt-combined-ca-bundle\") pod \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\" (UID: \"44e3fbd7-8cb7-462a-9990-f3e82b978c55\") " Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.062334 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.062403 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.062248 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.063867 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.064539 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-kube-api-access-ff8cp" (OuterVolumeSpecName: "kube-api-access-ff8cp") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "kube-api-access-ff8cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.065469 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.066352 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.066811 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.066892 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.071855 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.072737 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.089264 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.097205 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-inventory" (OuterVolumeSpecName: "inventory") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.097540 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "44e3fbd7-8cb7-462a-9990-f3e82b978c55" (UID: "44e3fbd7-8cb7-462a-9990-f3e82b978c55"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157267 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157302 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157313 4727 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157324 4727 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157340 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff8cp\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-kube-api-access-ff8cp\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157350 4727 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157359 4727 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157367 4727 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157378 4727 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157387 4727 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157394 4727 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157408 4727 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" 
DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157421 4727 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/44e3fbd7-8cb7-462a-9990-f3e82b978c55-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.157435 4727 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44e3fbd7-8cb7-462a-9990-f3e82b978c55-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.573514 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" event={"ID":"44e3fbd7-8cb7-462a-9990-f3e82b978c55","Type":"ContainerDied","Data":"ce78a47da2f9582d1daeaefaee92c668af118e4e41f160b4996cfe0802914cc3"} Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.574061 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce78a47da2f9582d1daeaefaee92c668af118e4e41f160b4996cfe0802914cc3" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.573618 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.667595 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv"] Oct 01 13:09:04 crc kubenswrapper[4727]: E1001 13:09:04.668090 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e3fbd7-8cb7-462a-9990-f3e82b978c55" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.668427 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e3fbd7-8cb7-462a-9990-f3e82b978c55" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.668714 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="44e3fbd7-8cb7-462a-9990-f3e82b978c55" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.669492 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.678792 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.679185 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.679376 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.679426 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv"] Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.679611 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.681308 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.769763 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.769853 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.770078 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9tkn\" (UniqueName: \"kubernetes.io/projected/f6510eaa-3789-48b0-94cc-c300c64714a2-kube-api-access-n9tkn\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.770202 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f6510eaa-3789-48b0-94cc-c300c64714a2-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.770334 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.871959 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.872047 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.872097 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9tkn\" (UniqueName: \"kubernetes.io/projected/f6510eaa-3789-48b0-94cc-c300c64714a2-kube-api-access-n9tkn\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.872150 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f6510eaa-3789-48b0-94cc-c300c64714a2-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.872212 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.873443 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f6510eaa-3789-48b0-94cc-c300c64714a2-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.879832 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.879832 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.887879 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") 
" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.891819 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9tkn\" (UniqueName: \"kubernetes.io/projected/f6510eaa-3789-48b0-94cc-c300c64714a2-kube-api-access-n9tkn\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9cvcv\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:04 crc kubenswrapper[4727]: I1001 13:09:04.988271 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:09:05 crc kubenswrapper[4727]: I1001 13:09:05.512164 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv"] Oct 01 13:09:05 crc kubenswrapper[4727]: I1001 13:09:05.583376 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" event={"ID":"f6510eaa-3789-48b0-94cc-c300c64714a2","Type":"ContainerStarted","Data":"071fb5a716efb1ddffd9e6bf0b3c464911672d2f9bd7d2d5917b14039570e795"} Oct 01 13:09:06 crc kubenswrapper[4727]: I1001 13:09:06.596245 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" event={"ID":"f6510eaa-3789-48b0-94cc-c300c64714a2","Type":"ContainerStarted","Data":"d882c3e9be631b3b3fad92012bf5fbf561cf1c8d24f2e428e55f7052b115742b"} Oct 01 13:09:06 crc kubenswrapper[4727]: I1001 13:09:06.616878 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" podStartSLOduration=2.088854143 podStartE2EDuration="2.616841938s" podCreationTimestamp="2025-10-01 13:09:04 +0000 UTC" firstStartedPulling="2025-10-01 13:09:05.518222919 +0000 UTC m=+1923.839577766" lastFinishedPulling="2025-10-01 13:09:06.046210724 +0000 UTC m=+1924.367565561" observedRunningTime="2025-10-01 13:09:06.614154544 +0000 UTC m=+1924.935509391" watchObservedRunningTime="2025-10-01 13:09:06.616841938 +0000 UTC m=+1924.938196775" Oct 01 13:10:11 crc kubenswrapper[4727]: I1001 13:10:11.204606 4727 generic.go:334] "Generic (PLEG): container finished" podID="f6510eaa-3789-48b0-94cc-c300c64714a2" containerID="d882c3e9be631b3b3fad92012bf5fbf561cf1c8d24f2e428e55f7052b115742b" exitCode=0 Oct 01 13:10:11 crc kubenswrapper[4727]: I1001 13:10:11.204711 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" event={"ID":"f6510eaa-3789-48b0-94cc-c300c64714a2","Type":"ContainerDied","Data":"d882c3e9be631b3b3fad92012bf5fbf561cf1c8d24f2e428e55f7052b115742b"} Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.676232 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.799553 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-inventory\") pod \"f6510eaa-3789-48b0-94cc-c300c64714a2\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.800250 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9tkn\" (UniqueName: \"kubernetes.io/projected/f6510eaa-3789-48b0-94cc-c300c64714a2-kube-api-access-n9tkn\") pod \"f6510eaa-3789-48b0-94cc-c300c64714a2\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.800535 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-ovn-combined-ca-bundle\") pod \"f6510eaa-3789-48b0-94cc-c300c64714a2\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.800630 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-ssh-key\") pod \"f6510eaa-3789-48b0-94cc-c300c64714a2\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.800804 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f6510eaa-3789-48b0-94cc-c300c64714a2-ovncontroller-config-0\") pod \"f6510eaa-3789-48b0-94cc-c300c64714a2\" (UID: \"f6510eaa-3789-48b0-94cc-c300c64714a2\") " Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.806281 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6510eaa-3789-48b0-94cc-c300c64714a2-kube-api-access-n9tkn" (OuterVolumeSpecName: "kube-api-access-n9tkn") pod "f6510eaa-3789-48b0-94cc-c300c64714a2" (UID: "f6510eaa-3789-48b0-94cc-c300c64714a2"). InnerVolumeSpecName "kube-api-access-n9tkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.806718 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f6510eaa-3789-48b0-94cc-c300c64714a2" (UID: "f6510eaa-3789-48b0-94cc-c300c64714a2"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.825921 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6510eaa-3789-48b0-94cc-c300c64714a2-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "f6510eaa-3789-48b0-94cc-c300c64714a2" (UID: "f6510eaa-3789-48b0-94cc-c300c64714a2"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.828417 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f6510eaa-3789-48b0-94cc-c300c64714a2" (UID: "f6510eaa-3789-48b0-94cc-c300c64714a2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.828839 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-inventory" (OuterVolumeSpecName: "inventory") pod "f6510eaa-3789-48b0-94cc-c300c64714a2" (UID: "f6510eaa-3789-48b0-94cc-c300c64714a2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.903224 4727 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.903263 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.903273 4727 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f6510eaa-3789-48b0-94cc-c300c64714a2-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.903282 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6510eaa-3789-48b0-94cc-c300c64714a2-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:12 crc kubenswrapper[4727]: I1001 13:10:12.903290 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9tkn\" (UniqueName: \"kubernetes.io/projected/f6510eaa-3789-48b0-94cc-c300c64714a2-kube-api-access-n9tkn\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.225154 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" event={"ID":"f6510eaa-3789-48b0-94cc-c300c64714a2","Type":"ContainerDied","Data":"071fb5a716efb1ddffd9e6bf0b3c464911672d2f9bd7d2d5917b14039570e795"} Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.225204 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="071fb5a716efb1ddffd9e6bf0b3c464911672d2f9bd7d2d5917b14039570e795" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.225470 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9cvcv" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.329446 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x"] Oct 01 13:10:13 crc kubenswrapper[4727]: E1001 13:10:13.329838 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6510eaa-3789-48b0-94cc-c300c64714a2" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.329857 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6510eaa-3789-48b0-94cc-c300c64714a2" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.330082 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6510eaa-3789-48b0-94cc-c300c64714a2" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.330801 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.334366 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.334819 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.334940 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.335012 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.335023 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.335072 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.341049 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x"] Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.419160 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.419205 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.419256 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.419524 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtwrq\" (UniqueName: \"kubernetes.io/projected/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-kube-api-access-mtwrq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.419711 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.419754 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.521529 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.521613 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.521682 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.521766 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtwrq\" (UniqueName: \"kubernetes.io/projected/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-kube-api-access-mtwrq\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.521866 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.521893 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.526140 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.526815 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.527251 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.527295 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.532303 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.538663 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mtwrq\" (UniqueName: \"kubernetes.io/projected/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-kube-api-access-mtwrq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:13 crc kubenswrapper[4727]: I1001 13:10:13.654384 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:10:14 crc kubenswrapper[4727]: I1001 13:10:14.190128 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x"] Oct 01 13:10:14 crc kubenswrapper[4727]: I1001 13:10:14.235424 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" event={"ID":"55cb8c1d-db01-4bc4-9c27-3a1fed55d823","Type":"ContainerStarted","Data":"765e79cd592691a83a037a94f5ecf64096d020456da4024bf937e883099eaa7f"} Oct 01 13:10:15 crc kubenswrapper[4727]: I1001 13:10:15.247303 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" event={"ID":"55cb8c1d-db01-4bc4-9c27-3a1fed55d823","Type":"ContainerStarted","Data":"2d0e2722f821470d8688166a84f1a67c2ff35bbb642cc5afc2b841896f4394c5"} Oct 01 13:10:15 crc kubenswrapper[4727]: I1001 13:10:15.277859 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" podStartSLOduration=1.7543529960000002 podStartE2EDuration="2.277832841s" podCreationTimestamp="2025-10-01 13:10:13 +0000 UTC" firstStartedPulling="2025-10-01 13:10:14.203129212 +0000 UTC m=+1992.524484049" lastFinishedPulling="2025-10-01 13:10:14.726609047 +0000 UTC m=+1993.047963894" observedRunningTime="2025-10-01 13:10:15.263407968 +0000 UTC m=+1993.584762825" watchObservedRunningTime="2025-10-01 13:10:15.277832841 +0000 UTC m=+1993.599187678" Oct 01 13:10:33 crc kubenswrapper[4727]: I1001 13:10:33.291689 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:10:33 crc kubenswrapper[4727]: I1001 13:10:33.292451 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:11:03 crc kubenswrapper[4727]: I1001 13:11:03.292399 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:11:03 crc kubenswrapper[4727]: I1001 13:11:03.293137 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Oct 01 13:11:07 crc kubenswrapper[4727]: I1001 13:11:07.765393 4727 generic.go:334] "Generic (PLEG): container finished" podID="55cb8c1d-db01-4bc4-9c27-3a1fed55d823" containerID="2d0e2722f821470d8688166a84f1a67c2ff35bbb642cc5afc2b841896f4394c5" exitCode=0 Oct 01 13:11:07 crc kubenswrapper[4727]: I1001 13:11:07.765449 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" event={"ID":"55cb8c1d-db01-4bc4-9c27-3a1fed55d823","Type":"ContainerDied","Data":"2d0e2722f821470d8688166a84f1a67c2ff35bbb642cc5afc2b841896f4394c5"} Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.293781 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.441116 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-ssh-key\") pod \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.441288 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-neutron-ovn-metadata-agent-neutron-config-0\") pod \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.441317 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-nova-metadata-neutron-config-0\") pod \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.441363 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtwrq\" (UniqueName: \"kubernetes.io/projected/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-kube-api-access-mtwrq\") pod \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.441419 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-neutron-metadata-combined-ca-bundle\") pod \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.441440 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-inventory\") pod \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.448858 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "55cb8c1d-db01-4bc4-9c27-3a1fed55d823" (UID: "55cb8c1d-db01-4bc4-9c27-3a1fed55d823"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.449653 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-kube-api-access-mtwrq" (OuterVolumeSpecName: "kube-api-access-mtwrq") pod "55cb8c1d-db01-4bc4-9c27-3a1fed55d823" (UID: "55cb8c1d-db01-4bc4-9c27-3a1fed55d823"). InnerVolumeSpecName "kube-api-access-mtwrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:09 crc kubenswrapper[4727]: E1001 13:11:09.473163 4727 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-inventory podName:55cb8c1d-db01-4bc4-9c27-3a1fed55d823 nodeName:}" failed. No retries permitted until 2025-10-01 13:11:09.973129511 +0000 UTC m=+2048.294484348 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-inventory") pod "55cb8c1d-db01-4bc4-9c27-3a1fed55d823" (UID: "55cb8c1d-db01-4bc4-9c27-3a1fed55d823") : error deleting /var/lib/kubelet/pods/55cb8c1d-db01-4bc4-9c27-3a1fed55d823/volume-subpaths: remove /var/lib/kubelet/pods/55cb8c1d-db01-4bc4-9c27-3a1fed55d823/volume-subpaths: no such file or directory Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.473494 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "55cb8c1d-db01-4bc4-9c27-3a1fed55d823" (UID: "55cb8c1d-db01-4bc4-9c27-3a1fed55d823"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.475417 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "55cb8c1d-db01-4bc4-9c27-3a1fed55d823" (UID: "55cb8c1d-db01-4bc4-9c27-3a1fed55d823"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.476484 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "55cb8c1d-db01-4bc4-9c27-3a1fed55d823" (UID: "55cb8c1d-db01-4bc4-9c27-3a1fed55d823"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.543456 4727 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.543496 4727 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.543512 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtwrq\" (UniqueName: \"kubernetes.io/projected/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-kube-api-access-mtwrq\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.543525 4727 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.543540 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.786191 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" event={"ID":"55cb8c1d-db01-4bc4-9c27-3a1fed55d823","Type":"ContainerDied","Data":"765e79cd592691a83a037a94f5ecf64096d020456da4024bf937e883099eaa7f"} Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.786593 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="765e79cd592691a83a037a94f5ecf64096d020456da4024bf937e883099eaa7f" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.786296 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.958219 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm"] Oct 01 13:11:09 crc kubenswrapper[4727]: E1001 13:11:09.959074 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cb8c1d-db01-4bc4-9c27-3a1fed55d823" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.959227 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cb8c1d-db01-4bc4-9c27-3a1fed55d823" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.959697 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cb8c1d-db01-4bc4-9c27-3a1fed55d823" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.960817 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.963354 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 01 13:11:09 crc kubenswrapper[4727]: I1001 13:11:09.973792 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm"] Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.053589 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-inventory\") pod \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\" (UID: \"55cb8c1d-db01-4bc4-9c27-3a1fed55d823\") " Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.054114 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.054230 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.054273 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.054326 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.054362 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpfhk\" (UniqueName: \"kubernetes.io/projected/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-kube-api-access-bpfhk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.060409 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-inventory" (OuterVolumeSpecName: "inventory") pod "55cb8c1d-db01-4bc4-9c27-3a1fed55d823" (UID: "55cb8c1d-db01-4bc4-9c27-3a1fed55d823"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.155946 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.156046 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpfhk\" (UniqueName: \"kubernetes.io/projected/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-kube-api-access-bpfhk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.156102 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.156225 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.156273 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.156338 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55cb8c1d-db01-4bc4-9c27-3a1fed55d823-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.160551 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.160800 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.161952 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.166992 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.173797 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpfhk\" (UniqueName: \"kubernetes.io/projected/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-kube-api-access-bpfhk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.288631 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:11:10 crc kubenswrapper[4727]: I1001 13:11:10.796650 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm"] Oct 01 13:11:11 crc kubenswrapper[4727]: I1001 13:11:11.803206 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" event={"ID":"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d","Type":"ContainerStarted","Data":"e99134c3ea57d80b996fa1ef9489d3152498826ecb6daff639d87a43d2e54f16"} Oct 01 13:11:12 crc kubenswrapper[4727]: I1001 13:11:12.813066 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" event={"ID":"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d","Type":"ContainerStarted","Data":"e77f97a548fd927d70323ea4afd8cd99d10a487fe2d1faffc9516515f772cfb4"} Oct 01 13:11:12 crc kubenswrapper[4727]: I1001 13:11:12.834402 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" podStartSLOduration=3.025851278 podStartE2EDuration="3.83437906s" podCreationTimestamp="2025-10-01 13:11:09 +0000 UTC" firstStartedPulling="2025-10-01 13:11:10.796449383 +0000 UTC m=+2049.117804210" lastFinishedPulling="2025-10-01 13:11:11.604977155 +0000 UTC m=+2049.926331992" observedRunningTime="2025-10-01 13:11:12.826250985 +0000 UTC m=+2051.147605822" watchObservedRunningTime="2025-10-01 13:11:12.83437906 +0000 UTC m=+2051.155733897" Oct 01 13:11:33 crc kubenswrapper[4727]: I1001 13:11:33.292317 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:11:33 crc kubenswrapper[4727]: I1001 13:11:33.293080 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:11:33 crc kubenswrapper[4727]: I1001 
13:11:33.293137 4727 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 13:11:33 crc kubenswrapper[4727]: I1001 13:11:33.293883 4727 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ffe3e19b0829e296a5017db986302c2eb85d3b8446c095789e9f37c908e4271f"} pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:11:33 crc kubenswrapper[4727]: I1001 13:11:33.293940 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" containerID="cri-o://ffe3e19b0829e296a5017db986302c2eb85d3b8446c095789e9f37c908e4271f" gracePeriod=600 Oct 01 13:11:33 crc kubenswrapper[4727]: E1001 13:11:33.623877 4727 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd18290ae_64a5_44a5_a704_90977d85852b.slice/crio-conmon-ffe3e19b0829e296a5017db986302c2eb85d3b8446c095789e9f37c908e4271f.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:11:34 crc kubenswrapper[4727]: I1001 13:11:34.011919 4727 generic.go:334] "Generic (PLEG): container finished" podID="d18290ae-64a5-44a5-a704-90977d85852b" containerID="ffe3e19b0829e296a5017db986302c2eb85d3b8446c095789e9f37c908e4271f" exitCode=0 Oct 01 13:11:34 crc kubenswrapper[4727]: I1001 13:11:34.012023 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerDied","Data":"ffe3e19b0829e296a5017db986302c2eb85d3b8446c095789e9f37c908e4271f"} Oct 01 13:11:34 crc kubenswrapper[4727]: I1001 13:11:34.012478 4727 scope.go:117] "RemoveContainer" containerID="34f48aad840b50c93d50a055d6f92286fb82d9c3e2f84b8006a3e9cc7016eba6" Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.022108 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2"} Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.054228 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pbskb"] Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.056202 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.070778 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbskb"] Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.141403 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-catalog-content\") pod \"redhat-marketplace-pbskb\" (UID: \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\") " pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.141591 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj689\" (UniqueName: \"kubernetes.io/projected/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-kube-api-access-dj689\") pod \"redhat-marketplace-pbskb\" (UID: \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\") " pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.141658 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-utilities\") pod \"redhat-marketplace-pbskb\" (UID: \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\") " pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.243703 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-utilities\") pod \"redhat-marketplace-pbskb\" (UID: \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\") " pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.244113 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-catalog-content\") pod \"redhat-marketplace-pbskb\" (UID: \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\") " pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.244265 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-utilities\") pod \"redhat-marketplace-pbskb\" (UID: \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\") " pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.244442 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj689\" (UniqueName: \"kubernetes.io/projected/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-kube-api-access-dj689\") pod \"redhat-marketplace-pbskb\" (UID: \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\") " pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.244586 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-catalog-content\") pod \"redhat-marketplace-pbskb\" (UID: \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\") " pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.271121 4727 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dj689\" (UniqueName: \"kubernetes.io/projected/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-kube-api-access-dj689\") pod \"redhat-marketplace-pbskb\" (UID: \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\") " pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.385589 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:35 crc kubenswrapper[4727]: W1001 13:11:35.846118 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b5229c7_4c32_4cb1_9a51_0e4111d2ad4e.slice/crio-fa2f9f0b58edfae5dedc56e0f85bf53c751ab2bf9553b33660eab5c961716a79 WatchSource:0}: Error finding container fa2f9f0b58edfae5dedc56e0f85bf53c751ab2bf9553b33660eab5c961716a79: Status 404 returned error can't find the container with id fa2f9f0b58edfae5dedc56e0f85bf53c751ab2bf9553b33660eab5c961716a79 Oct 01 13:11:35 crc kubenswrapper[4727]: I1001 13:11:35.847610 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbskb"] Oct 01 13:11:36 crc kubenswrapper[4727]: I1001 13:11:36.033045 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbskb" event={"ID":"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e","Type":"ContainerStarted","Data":"fa2f9f0b58edfae5dedc56e0f85bf53c751ab2bf9553b33660eab5c961716a79"} Oct 01 13:11:37 crc kubenswrapper[4727]: I1001 13:11:37.047989 4727 generic.go:334] "Generic (PLEG): container finished" podID="2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" containerID="b4378c420eb1be3d26bea000946f041132b0245155faa6ec2708cefbfd6d0b47" exitCode=0 Oct 01 13:11:37 crc kubenswrapper[4727]: I1001 13:11:37.048117 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbskb" event={"ID":"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e","Type":"ContainerDied","Data":"b4378c420eb1be3d26bea000946f041132b0245155faa6ec2708cefbfd6d0b47"} Oct 01 13:11:39 crc kubenswrapper[4727]: I1001 13:11:39.070514 4727 generic.go:334] "Generic (PLEG): container finished" podID="2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" containerID="3246c379b161a23c2dec5f4f346fe8315e4aeb53da6fd9159a246207041c8155" exitCode=0 Oct 01 13:11:39 crc kubenswrapper[4727]: I1001 13:11:39.070645 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbskb" event={"ID":"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e","Type":"ContainerDied","Data":"3246c379b161a23c2dec5f4f346fe8315e4aeb53da6fd9159a246207041c8155"} Oct 01 13:11:40 crc kubenswrapper[4727]: I1001 13:11:40.082844 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbskb" event={"ID":"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e","Type":"ContainerStarted","Data":"76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3"} Oct 01 13:11:40 crc kubenswrapper[4727]: I1001 13:11:40.103566 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pbskb" podStartSLOduration=2.645963546 podStartE2EDuration="5.103545718s" podCreationTimestamp="2025-10-01 13:11:35 +0000 UTC" firstStartedPulling="2025-10-01 13:11:37.050375873 +0000 UTC m=+2075.371730710" lastFinishedPulling="2025-10-01 13:11:39.507958055 +0000 UTC m=+2077.829312882" observedRunningTime="2025-10-01 13:11:40.101028818 +0000 UTC m=+2078.422383665" 
watchObservedRunningTime="2025-10-01 13:11:40.103545718 +0000 UTC m=+2078.424900565" Oct 01 13:11:45 crc kubenswrapper[4727]: I1001 13:11:45.386668 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:45 crc kubenswrapper[4727]: I1001 13:11:45.387239 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:45 crc kubenswrapper[4727]: I1001 13:11:45.446419 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:46 crc kubenswrapper[4727]: I1001 13:11:46.178206 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:46 crc kubenswrapper[4727]: I1001 13:11:46.227243 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbskb"] Oct 01 13:11:48 crc kubenswrapper[4727]: I1001 13:11:48.149056 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pbskb" podUID="2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" containerName="registry-server" containerID="cri-o://76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3" gracePeriod=2 Oct 01 13:11:48 crc kubenswrapper[4727]: I1001 13:11:48.614038 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:48 crc kubenswrapper[4727]: I1001 13:11:48.714013 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-catalog-content\") pod \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\" (UID: \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\") " Oct 01 13:11:48 crc kubenswrapper[4727]: I1001 13:11:48.714239 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj689\" (UniqueName: \"kubernetes.io/projected/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-kube-api-access-dj689\") pod \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\" (UID: \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\") " Oct 01 13:11:48 crc kubenswrapper[4727]: I1001 13:11:48.714297 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-utilities\") pod \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\" (UID: \"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e\") " Oct 01 13:11:48 crc kubenswrapper[4727]: I1001 13:11:48.715165 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-utilities" (OuterVolumeSpecName: "utilities") pod "2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" (UID: "2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:11:48 crc kubenswrapper[4727]: I1001 13:11:48.716611 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:48 crc kubenswrapper[4727]: I1001 13:11:48.719786 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-kube-api-access-dj689" (OuterVolumeSpecName: "kube-api-access-dj689") pod "2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" (UID: "2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e"). InnerVolumeSpecName "kube-api-access-dj689". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:11:48 crc kubenswrapper[4727]: I1001 13:11:48.728920 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" (UID: "2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:11:48 crc kubenswrapper[4727]: I1001 13:11:48.817928 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:48 crc kubenswrapper[4727]: I1001 13:11:48.817975 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj689\" (UniqueName: \"kubernetes.io/projected/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e-kube-api-access-dj689\") on node \"crc\" DevicePath \"\"" Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.160591 4727 generic.go:334] "Generic (PLEG): container finished" podID="2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" containerID="76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3" exitCode=0 Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.160646 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbskb" event={"ID":"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e","Type":"ContainerDied","Data":"76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3"} Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.160699 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pbskb" event={"ID":"2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e","Type":"ContainerDied","Data":"fa2f9f0b58edfae5dedc56e0f85bf53c751ab2bf9553b33660eab5c961716a79"} Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.160720 4727 scope.go:117] "RemoveContainer" containerID="76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3" Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.160788 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pbskb" Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.198044 4727 scope.go:117] "RemoveContainer" containerID="3246c379b161a23c2dec5f4f346fe8315e4aeb53da6fd9159a246207041c8155" Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.201218 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbskb"] Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.210943 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pbskb"] Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.227413 4727 scope.go:117] "RemoveContainer" containerID="b4378c420eb1be3d26bea000946f041132b0245155faa6ec2708cefbfd6d0b47" Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.275500 4727 scope.go:117] "RemoveContainer" containerID="76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3" Oct 01 13:11:49 crc kubenswrapper[4727]: E1001 13:11:49.275963 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3\": container with ID starting with 76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3 not found: ID does not exist" containerID="76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3" Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.276016 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3"} err="failed to get container status \"76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3\": rpc error: code = NotFound desc = could not find container \"76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3\": container with ID starting with 76712177b5f1006e206ab1c62d8cc64ae60f8ac195f81be04b2602de8ce6b9a3 not found: ID does not exist" Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.276043 4727 scope.go:117] "RemoveContainer" containerID="3246c379b161a23c2dec5f4f346fe8315e4aeb53da6fd9159a246207041c8155" Oct 01 13:11:49 crc kubenswrapper[4727]: E1001 13:11:49.276666 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3246c379b161a23c2dec5f4f346fe8315e4aeb53da6fd9159a246207041c8155\": container with ID starting with 3246c379b161a23c2dec5f4f346fe8315e4aeb53da6fd9159a246207041c8155 not found: ID does not exist" containerID="3246c379b161a23c2dec5f4f346fe8315e4aeb53da6fd9159a246207041c8155" Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.276694 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3246c379b161a23c2dec5f4f346fe8315e4aeb53da6fd9159a246207041c8155"} err="failed to get container status \"3246c379b161a23c2dec5f4f346fe8315e4aeb53da6fd9159a246207041c8155\": rpc error: code = NotFound desc = could not find container \"3246c379b161a23c2dec5f4f346fe8315e4aeb53da6fd9159a246207041c8155\": container with ID starting with 3246c379b161a23c2dec5f4f346fe8315e4aeb53da6fd9159a246207041c8155 not found: ID does not exist" Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.276711 4727 scope.go:117] "RemoveContainer" containerID="b4378c420eb1be3d26bea000946f041132b0245155faa6ec2708cefbfd6d0b47" Oct 01 13:11:49 crc kubenswrapper[4727]: E1001 13:11:49.276959 4727 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b4378c420eb1be3d26bea000946f041132b0245155faa6ec2708cefbfd6d0b47\": container with ID starting with b4378c420eb1be3d26bea000946f041132b0245155faa6ec2708cefbfd6d0b47 not found: ID does not exist" containerID="b4378c420eb1be3d26bea000946f041132b0245155faa6ec2708cefbfd6d0b47" Oct 01 13:11:49 crc kubenswrapper[4727]: I1001 13:11:49.276991 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4378c420eb1be3d26bea000946f041132b0245155faa6ec2708cefbfd6d0b47"} err="failed to get container status \"b4378c420eb1be3d26bea000946f041132b0245155faa6ec2708cefbfd6d0b47\": rpc error: code = NotFound desc = could not find container \"b4378c420eb1be3d26bea000946f041132b0245155faa6ec2708cefbfd6d0b47\": container with ID starting with b4378c420eb1be3d26bea000946f041132b0245155faa6ec2708cefbfd6d0b47 not found: ID does not exist" Oct 01 13:11:50 crc kubenswrapper[4727]: I1001 13:11:50.382533 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" path="/var/lib/kubelet/pods/2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e/volumes" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.007144 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wtksw"] Oct 01 13:13:08 crc kubenswrapper[4727]: E1001 13:13:08.008364 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" containerName="extract-utilities" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.008383 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" containerName="extract-utilities" Oct 01 13:13:08 crc kubenswrapper[4727]: E1001 13:13:08.008595 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" containerName="registry-server" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.008604 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" containerName="registry-server" Oct 01 13:13:08 crc kubenswrapper[4727]: E1001 13:13:08.008627 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" containerName="extract-content" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.008635 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" containerName="extract-content" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.008955 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b5229c7-4c32-4cb1-9a51-0e4111d2ad4e" containerName="registry-server" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.010980 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.019058 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wtksw"] Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.092153 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/916076de-4629-4a5a-853f-1fd087012c25-utilities\") pod \"community-operators-wtksw\" (UID: \"916076de-4629-4a5a-853f-1fd087012c25\") " pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.092710 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxc9t\" (UniqueName: \"kubernetes.io/projected/916076de-4629-4a5a-853f-1fd087012c25-kube-api-access-gxc9t\") pod \"community-operators-wtksw\" (UID: \"916076de-4629-4a5a-853f-1fd087012c25\") " pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.092965 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/916076de-4629-4a5a-853f-1fd087012c25-catalog-content\") pod \"community-operators-wtksw\" (UID: \"916076de-4629-4a5a-853f-1fd087012c25\") " pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.194615 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/916076de-4629-4a5a-853f-1fd087012c25-utilities\") pod \"community-operators-wtksw\" (UID: \"916076de-4629-4a5a-853f-1fd087012c25\") " pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.194728 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxc9t\" (UniqueName: \"kubernetes.io/projected/916076de-4629-4a5a-853f-1fd087012c25-kube-api-access-gxc9t\") pod \"community-operators-wtksw\" (UID: \"916076de-4629-4a5a-853f-1fd087012c25\") " pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.194794 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/916076de-4629-4a5a-853f-1fd087012c25-catalog-content\") pod \"community-operators-wtksw\" (UID: \"916076de-4629-4a5a-853f-1fd087012c25\") " pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.195310 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/916076de-4629-4a5a-853f-1fd087012c25-utilities\") pod \"community-operators-wtksw\" (UID: \"916076de-4629-4a5a-853f-1fd087012c25\") " pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.195383 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/916076de-4629-4a5a-853f-1fd087012c25-catalog-content\") pod \"community-operators-wtksw\" (UID: \"916076de-4629-4a5a-853f-1fd087012c25\") " pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.220187 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gxc9t\" (UniqueName: \"kubernetes.io/projected/916076de-4629-4a5a-853f-1fd087012c25-kube-api-access-gxc9t\") pod \"community-operators-wtksw\" (UID: \"916076de-4629-4a5a-853f-1fd087012c25\") " pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.335339 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:08 crc kubenswrapper[4727]: I1001 13:13:08.922377 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wtksw"] Oct 01 13:13:09 crc kubenswrapper[4727]: I1001 13:13:09.885712 4727 generic.go:334] "Generic (PLEG): container finished" podID="916076de-4629-4a5a-853f-1fd087012c25" containerID="e30382411d3016ad061fff20ea0a503ee1dc41955d2b6f1c0b90e3c2784e60e4" exitCode=0 Oct 01 13:13:09 crc kubenswrapper[4727]: I1001 13:13:09.885795 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtksw" event={"ID":"916076de-4629-4a5a-853f-1fd087012c25","Type":"ContainerDied","Data":"e30382411d3016ad061fff20ea0a503ee1dc41955d2b6f1c0b90e3c2784e60e4"} Oct 01 13:13:09 crc kubenswrapper[4727]: I1001 13:13:09.886151 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtksw" event={"ID":"916076de-4629-4a5a-853f-1fd087012c25","Type":"ContainerStarted","Data":"517fb371086eebe5a12f9bae9242c264354ebd2bdb2f8ba7f01540c62041dc43"} Oct 01 13:13:09 crc kubenswrapper[4727]: I1001 13:13:09.888643 4727 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:13:11 crc kubenswrapper[4727]: I1001 13:13:11.905172 4727 generic.go:334] "Generic (PLEG): container finished" podID="916076de-4629-4a5a-853f-1fd087012c25" containerID="a778cd0a00201884a772c51e41d0c654358d83bef200a5bb0fb839e74530990e" exitCode=0 Oct 01 13:13:11 crc kubenswrapper[4727]: I1001 13:13:11.905297 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtksw" event={"ID":"916076de-4629-4a5a-853f-1fd087012c25","Type":"ContainerDied","Data":"a778cd0a00201884a772c51e41d0c654358d83bef200a5bb0fb839e74530990e"} Oct 01 13:13:12 crc kubenswrapper[4727]: I1001 13:13:12.916329 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtksw" event={"ID":"916076de-4629-4a5a-853f-1fd087012c25","Type":"ContainerStarted","Data":"ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330"} Oct 01 13:13:12 crc kubenswrapper[4727]: I1001 13:13:12.940395 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wtksw" podStartSLOduration=3.275530081 podStartE2EDuration="5.94037546s" podCreationTimestamp="2025-10-01 13:13:07 +0000 UTC" firstStartedPulling="2025-10-01 13:13:09.888337653 +0000 UTC m=+2168.209692500" lastFinishedPulling="2025-10-01 13:13:12.553182992 +0000 UTC m=+2170.874537879" observedRunningTime="2025-10-01 13:13:12.933592236 +0000 UTC m=+2171.254947073" watchObservedRunningTime="2025-10-01 13:13:12.94037546 +0000 UTC m=+2171.261730287" Oct 01 13:13:18 crc kubenswrapper[4727]: I1001 13:13:18.335693 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:18 crc kubenswrapper[4727]: I1001 13:13:18.336453 4727 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:18 crc kubenswrapper[4727]: I1001 13:13:18.387096 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:19 crc kubenswrapper[4727]: I1001 13:13:19.031492 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:19 crc kubenswrapper[4727]: I1001 13:13:19.092290 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wtksw"] Oct 01 13:13:21 crc kubenswrapper[4727]: I1001 13:13:21.002443 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wtksw" podUID="916076de-4629-4a5a-853f-1fd087012c25" containerName="registry-server" containerID="cri-o://ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330" gracePeriod=2 Oct 01 13:13:21 crc kubenswrapper[4727]: I1001 13:13:21.538961 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:21 crc kubenswrapper[4727]: I1001 13:13:21.592763 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/916076de-4629-4a5a-853f-1fd087012c25-utilities\") pod \"916076de-4629-4a5a-853f-1fd087012c25\" (UID: \"916076de-4629-4a5a-853f-1fd087012c25\") " Oct 01 13:13:21 crc kubenswrapper[4727]: I1001 13:13:21.592900 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxc9t\" (UniqueName: \"kubernetes.io/projected/916076de-4629-4a5a-853f-1fd087012c25-kube-api-access-gxc9t\") pod \"916076de-4629-4a5a-853f-1fd087012c25\" (UID: \"916076de-4629-4a5a-853f-1fd087012c25\") " Oct 01 13:13:21 crc kubenswrapper[4727]: I1001 13:13:21.592929 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/916076de-4629-4a5a-853f-1fd087012c25-catalog-content\") pod \"916076de-4629-4a5a-853f-1fd087012c25\" (UID: \"916076de-4629-4a5a-853f-1fd087012c25\") " Oct 01 13:13:21 crc kubenswrapper[4727]: I1001 13:13:21.594204 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/916076de-4629-4a5a-853f-1fd087012c25-utilities" (OuterVolumeSpecName: "utilities") pod "916076de-4629-4a5a-853f-1fd087012c25" (UID: "916076de-4629-4a5a-853f-1fd087012c25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:21 crc kubenswrapper[4727]: I1001 13:13:21.601424 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/916076de-4629-4a5a-853f-1fd087012c25-kube-api-access-gxc9t" (OuterVolumeSpecName: "kube-api-access-gxc9t") pod "916076de-4629-4a5a-853f-1fd087012c25" (UID: "916076de-4629-4a5a-853f-1fd087012c25"). InnerVolumeSpecName "kube-api-access-gxc9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:21 crc kubenswrapper[4727]: I1001 13:13:21.695110 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/916076de-4629-4a5a-853f-1fd087012c25-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:21 crc kubenswrapper[4727]: I1001 13:13:21.695161 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxc9t\" (UniqueName: \"kubernetes.io/projected/916076de-4629-4a5a-853f-1fd087012c25-kube-api-access-gxc9t\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.013902 4727 generic.go:334] "Generic (PLEG): container finished" podID="916076de-4629-4a5a-853f-1fd087012c25" containerID="ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330" exitCode=0 Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.013953 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtksw" event={"ID":"916076de-4629-4a5a-853f-1fd087012c25","Type":"ContainerDied","Data":"ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330"} Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.013993 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtksw" event={"ID":"916076de-4629-4a5a-853f-1fd087012c25","Type":"ContainerDied","Data":"517fb371086eebe5a12f9bae9242c264354ebd2bdb2f8ba7f01540c62041dc43"} Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.014038 4727 scope.go:117] "RemoveContainer" containerID="ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330" Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.014076 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wtksw" Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.044424 4727 scope.go:117] "RemoveContainer" containerID="a778cd0a00201884a772c51e41d0c654358d83bef200a5bb0fb839e74530990e" Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.070965 4727 scope.go:117] "RemoveContainer" containerID="e30382411d3016ad061fff20ea0a503ee1dc41955d2b6f1c0b90e3c2784e60e4" Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.113303 4727 scope.go:117] "RemoveContainer" containerID="ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330" Oct 01 13:13:22 crc kubenswrapper[4727]: E1001 13:13:22.113939 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330\": container with ID starting with ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330 not found: ID does not exist" containerID="ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330" Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.114047 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330"} err="failed to get container status \"ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330\": rpc error: code = NotFound desc = could not find container \"ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330\": container with ID starting with ffa0fc1da072edf09f599b988d566e26c2995a1a9ae66ef1d964c4bc7b646330 not found: ID does not exist" Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.114090 4727 scope.go:117] "RemoveContainer" containerID="a778cd0a00201884a772c51e41d0c654358d83bef200a5bb0fb839e74530990e" Oct 01 13:13:22 crc kubenswrapper[4727]: E1001 13:13:22.114621 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a778cd0a00201884a772c51e41d0c654358d83bef200a5bb0fb839e74530990e\": container with ID starting with a778cd0a00201884a772c51e41d0c654358d83bef200a5bb0fb839e74530990e not found: ID does not exist" containerID="a778cd0a00201884a772c51e41d0c654358d83bef200a5bb0fb839e74530990e" Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.114676 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a778cd0a00201884a772c51e41d0c654358d83bef200a5bb0fb839e74530990e"} err="failed to get container status \"a778cd0a00201884a772c51e41d0c654358d83bef200a5bb0fb839e74530990e\": rpc error: code = NotFound desc = could not find container \"a778cd0a00201884a772c51e41d0c654358d83bef200a5bb0fb839e74530990e\": container with ID starting with a778cd0a00201884a772c51e41d0c654358d83bef200a5bb0fb839e74530990e not found: ID does not exist" Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.114712 4727 scope.go:117] "RemoveContainer" containerID="e30382411d3016ad061fff20ea0a503ee1dc41955d2b6f1c0b90e3c2784e60e4" Oct 01 13:13:22 crc kubenswrapper[4727]: E1001 13:13:22.115723 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e30382411d3016ad061fff20ea0a503ee1dc41955d2b6f1c0b90e3c2784e60e4\": container with ID starting with e30382411d3016ad061fff20ea0a503ee1dc41955d2b6f1c0b90e3c2784e60e4 not found: ID does not exist" containerID="e30382411d3016ad061fff20ea0a503ee1dc41955d2b6f1c0b90e3c2784e60e4" 
Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.115769 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30382411d3016ad061fff20ea0a503ee1dc41955d2b6f1c0b90e3c2784e60e4"} err="failed to get container status \"e30382411d3016ad061fff20ea0a503ee1dc41955d2b6f1c0b90e3c2784e60e4\": rpc error: code = NotFound desc = could not find container \"e30382411d3016ad061fff20ea0a503ee1dc41955d2b6f1c0b90e3c2784e60e4\": container with ID starting with e30382411d3016ad061fff20ea0a503ee1dc41955d2b6f1c0b90e3c2784e60e4 not found: ID does not exist" Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.211020 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/916076de-4629-4a5a-853f-1fd087012c25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "916076de-4629-4a5a-853f-1fd087012c25" (UID: "916076de-4629-4a5a-853f-1fd087012c25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.308924 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/916076de-4629-4a5a-853f-1fd087012c25-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.354215 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wtksw"] Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.365752 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wtksw"] Oct 01 13:13:22 crc kubenswrapper[4727]: I1001 13:13:22.385208 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="916076de-4629-4a5a-853f-1fd087012c25" path="/var/lib/kubelet/pods/916076de-4629-4a5a-853f-1fd087012c25/volumes" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.150546 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tg2p5"] Oct 01 13:13:43 crc kubenswrapper[4727]: E1001 13:13:43.151541 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916076de-4629-4a5a-853f-1fd087012c25" containerName="extract-utilities" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.151554 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="916076de-4629-4a5a-853f-1fd087012c25" containerName="extract-utilities" Oct 01 13:13:43 crc kubenswrapper[4727]: E1001 13:13:43.151625 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916076de-4629-4a5a-853f-1fd087012c25" containerName="extract-content" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.151634 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="916076de-4629-4a5a-853f-1fd087012c25" containerName="extract-content" Oct 01 13:13:43 crc kubenswrapper[4727]: E1001 13:13:43.151643 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="916076de-4629-4a5a-853f-1fd087012c25" containerName="registry-server" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.151649 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="916076de-4629-4a5a-853f-1fd087012c25" containerName="registry-server" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.151839 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="916076de-4629-4a5a-853f-1fd087012c25" containerName="registry-server" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.153369 4727 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.162877 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tg2p5"] Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.197921 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd07db42-b958-40ac-bc17-5046c9619784-catalog-content\") pod \"redhat-operators-tg2p5\" (UID: \"fd07db42-b958-40ac-bc17-5046c9619784\") " pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.198346 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtzdc\" (UniqueName: \"kubernetes.io/projected/fd07db42-b958-40ac-bc17-5046c9619784-kube-api-access-vtzdc\") pod \"redhat-operators-tg2p5\" (UID: \"fd07db42-b958-40ac-bc17-5046c9619784\") " pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.198408 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd07db42-b958-40ac-bc17-5046c9619784-utilities\") pod \"redhat-operators-tg2p5\" (UID: \"fd07db42-b958-40ac-bc17-5046c9619784\") " pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.300489 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd07db42-b958-40ac-bc17-5046c9619784-catalog-content\") pod \"redhat-operators-tg2p5\" (UID: \"fd07db42-b958-40ac-bc17-5046c9619784\") " pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.300597 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtzdc\" (UniqueName: \"kubernetes.io/projected/fd07db42-b958-40ac-bc17-5046c9619784-kube-api-access-vtzdc\") pod \"redhat-operators-tg2p5\" (UID: \"fd07db42-b958-40ac-bc17-5046c9619784\") " pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.300642 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd07db42-b958-40ac-bc17-5046c9619784-utilities\") pod \"redhat-operators-tg2p5\" (UID: \"fd07db42-b958-40ac-bc17-5046c9619784\") " pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.301108 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd07db42-b958-40ac-bc17-5046c9619784-catalog-content\") pod \"redhat-operators-tg2p5\" (UID: \"fd07db42-b958-40ac-bc17-5046c9619784\") " pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.301166 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd07db42-b958-40ac-bc17-5046c9619784-utilities\") pod \"redhat-operators-tg2p5\" (UID: \"fd07db42-b958-40ac-bc17-5046c9619784\") " pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.322543 4727 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vtzdc\" (UniqueName: \"kubernetes.io/projected/fd07db42-b958-40ac-bc17-5046c9619784-kube-api-access-vtzdc\") pod \"redhat-operators-tg2p5\" (UID: \"fd07db42-b958-40ac-bc17-5046c9619784\") " pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.475018 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:43 crc kubenswrapper[4727]: I1001 13:13:43.983580 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tg2p5"] Oct 01 13:13:44 crc kubenswrapper[4727]: I1001 13:13:44.226763 4727 generic.go:334] "Generic (PLEG): container finished" podID="fd07db42-b958-40ac-bc17-5046c9619784" containerID="a2877d0e2b7d6986f1fd5bb5dad39577482461175836fd3b0393f0143701b120" exitCode=0 Oct 01 13:13:44 crc kubenswrapper[4727]: I1001 13:13:44.226910 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p5" event={"ID":"fd07db42-b958-40ac-bc17-5046c9619784","Type":"ContainerDied","Data":"a2877d0e2b7d6986f1fd5bb5dad39577482461175836fd3b0393f0143701b120"} Oct 01 13:13:44 crc kubenswrapper[4727]: I1001 13:13:44.227060 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p5" event={"ID":"fd07db42-b958-40ac-bc17-5046c9619784","Type":"ContainerStarted","Data":"de606eff753cbfd0140174f35148e4d00f34a81f47b92b20091e998d9e068bbc"} Oct 01 13:13:46 crc kubenswrapper[4727]: I1001 13:13:46.251135 4727 generic.go:334] "Generic (PLEG): container finished" podID="fd07db42-b958-40ac-bc17-5046c9619784" containerID="a8b81da80644c1d720dc70a86f12f1f3abae41572683feec13dce0e634282d3e" exitCode=0 Oct 01 13:13:46 crc kubenswrapper[4727]: I1001 13:13:46.251897 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p5" event={"ID":"fd07db42-b958-40ac-bc17-5046c9619784","Type":"ContainerDied","Data":"a8b81da80644c1d720dc70a86f12f1f3abae41572683feec13dce0e634282d3e"} Oct 01 13:13:47 crc kubenswrapper[4727]: I1001 13:13:47.262947 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p5" event={"ID":"fd07db42-b958-40ac-bc17-5046c9619784","Type":"ContainerStarted","Data":"cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3"} Oct 01 13:13:48 crc kubenswrapper[4727]: I1001 13:13:48.295185 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tg2p5" podStartSLOduration=2.7961821110000002 podStartE2EDuration="5.29516583s" podCreationTimestamp="2025-10-01 13:13:43 +0000 UTC" firstStartedPulling="2025-10-01 13:13:44.228425833 +0000 UTC m=+2202.549780670" lastFinishedPulling="2025-10-01 13:13:46.727409552 +0000 UTC m=+2205.048764389" observedRunningTime="2025-10-01 13:13:48.289631565 +0000 UTC m=+2206.610986402" watchObservedRunningTime="2025-10-01 13:13:48.29516583 +0000 UTC m=+2206.616520697" Oct 01 13:13:53 crc kubenswrapper[4727]: I1001 13:13:53.475097 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:53 crc kubenswrapper[4727]: I1001 13:13:53.476945 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:53 crc kubenswrapper[4727]: I1001 13:13:53.540100 4727 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:54 crc kubenswrapper[4727]: I1001 13:13:54.385904 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:54 crc kubenswrapper[4727]: I1001 13:13:54.433693 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tg2p5"] Oct 01 13:13:56 crc kubenswrapper[4727]: I1001 13:13:56.359290 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tg2p5" podUID="fd07db42-b958-40ac-bc17-5046c9619784" containerName="registry-server" containerID="cri-o://cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3" gracePeriod=2 Oct 01 13:13:56 crc kubenswrapper[4727]: I1001 13:13:56.819611 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:56 crc kubenswrapper[4727]: I1001 13:13:56.858859 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtzdc\" (UniqueName: \"kubernetes.io/projected/fd07db42-b958-40ac-bc17-5046c9619784-kube-api-access-vtzdc\") pod \"fd07db42-b958-40ac-bc17-5046c9619784\" (UID: \"fd07db42-b958-40ac-bc17-5046c9619784\") " Oct 01 13:13:56 crc kubenswrapper[4727]: I1001 13:13:56.858916 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd07db42-b958-40ac-bc17-5046c9619784-catalog-content\") pod \"fd07db42-b958-40ac-bc17-5046c9619784\" (UID: \"fd07db42-b958-40ac-bc17-5046c9619784\") " Oct 01 13:13:56 crc kubenswrapper[4727]: I1001 13:13:56.858994 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd07db42-b958-40ac-bc17-5046c9619784-utilities\") pod \"fd07db42-b958-40ac-bc17-5046c9619784\" (UID: \"fd07db42-b958-40ac-bc17-5046c9619784\") " Oct 01 13:13:56 crc kubenswrapper[4727]: I1001 13:13:56.860241 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd07db42-b958-40ac-bc17-5046c9619784-utilities" (OuterVolumeSpecName: "utilities") pod "fd07db42-b958-40ac-bc17-5046c9619784" (UID: "fd07db42-b958-40ac-bc17-5046c9619784"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:56 crc kubenswrapper[4727]: I1001 13:13:56.864611 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd07db42-b958-40ac-bc17-5046c9619784-kube-api-access-vtzdc" (OuterVolumeSpecName: "kube-api-access-vtzdc") pod "fd07db42-b958-40ac-bc17-5046c9619784" (UID: "fd07db42-b958-40ac-bc17-5046c9619784"). InnerVolumeSpecName "kube-api-access-vtzdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:56 crc kubenswrapper[4727]: I1001 13:13:56.951326 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd07db42-b958-40ac-bc17-5046c9619784-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd07db42-b958-40ac-bc17-5046c9619784" (UID: "fd07db42-b958-40ac-bc17-5046c9619784"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:56 crc kubenswrapper[4727]: I1001 13:13:56.961430 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtzdc\" (UniqueName: \"kubernetes.io/projected/fd07db42-b958-40ac-bc17-5046c9619784-kube-api-access-vtzdc\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:56 crc kubenswrapper[4727]: I1001 13:13:56.961473 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd07db42-b958-40ac-bc17-5046c9619784-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:56 crc kubenswrapper[4727]: I1001 13:13:56.961488 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd07db42-b958-40ac-bc17-5046c9619784-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.373361 4727 generic.go:334] "Generic (PLEG): container finished" podID="fd07db42-b958-40ac-bc17-5046c9619784" containerID="cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3" exitCode=0 Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.373511 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tg2p5" Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.373531 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p5" event={"ID":"fd07db42-b958-40ac-bc17-5046c9619784","Type":"ContainerDied","Data":"cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3"} Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.374039 4727 scope.go:117] "RemoveContainer" containerID="cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3" Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.373991 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tg2p5" event={"ID":"fd07db42-b958-40ac-bc17-5046c9619784","Type":"ContainerDied","Data":"de606eff753cbfd0140174f35148e4d00f34a81f47b92b20091e998d9e068bbc"} Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.407751 4727 scope.go:117] "RemoveContainer" containerID="a8b81da80644c1d720dc70a86f12f1f3abae41572683feec13dce0e634282d3e" Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.428677 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tg2p5"] Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.437626 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tg2p5"] Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.455478 4727 scope.go:117] "RemoveContainer" containerID="a2877d0e2b7d6986f1fd5bb5dad39577482461175836fd3b0393f0143701b120" Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.497135 4727 scope.go:117] "RemoveContainer" containerID="cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3" Oct 01 13:13:57 crc kubenswrapper[4727]: E1001 13:13:57.499694 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3\": container with ID starting with cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3 not found: ID does not exist" containerID="cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3" Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.499752 4727 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3"} err="failed to get container status \"cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3\": rpc error: code = NotFound desc = could not find container \"cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3\": container with ID starting with cced634f6b8da5cf87a38df67331baa23df2f57309bb4864850cdd9d51d6c3d3 not found: ID does not exist" Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.499791 4727 scope.go:117] "RemoveContainer" containerID="a8b81da80644c1d720dc70a86f12f1f3abae41572683feec13dce0e634282d3e" Oct 01 13:13:57 crc kubenswrapper[4727]: E1001 13:13:57.500394 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8b81da80644c1d720dc70a86f12f1f3abae41572683feec13dce0e634282d3e\": container with ID starting with a8b81da80644c1d720dc70a86f12f1f3abae41572683feec13dce0e634282d3e not found: ID does not exist" containerID="a8b81da80644c1d720dc70a86f12f1f3abae41572683feec13dce0e634282d3e" Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.500448 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b81da80644c1d720dc70a86f12f1f3abae41572683feec13dce0e634282d3e"} err="failed to get container status \"a8b81da80644c1d720dc70a86f12f1f3abae41572683feec13dce0e634282d3e\": rpc error: code = NotFound desc = could not find container \"a8b81da80644c1d720dc70a86f12f1f3abae41572683feec13dce0e634282d3e\": container with ID starting with a8b81da80644c1d720dc70a86f12f1f3abae41572683feec13dce0e634282d3e not found: ID does not exist" Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.500488 4727 scope.go:117] "RemoveContainer" containerID="a2877d0e2b7d6986f1fd5bb5dad39577482461175836fd3b0393f0143701b120" Oct 01 13:13:57 crc kubenswrapper[4727]: E1001 13:13:57.501405 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2877d0e2b7d6986f1fd5bb5dad39577482461175836fd3b0393f0143701b120\": container with ID starting with a2877d0e2b7d6986f1fd5bb5dad39577482461175836fd3b0393f0143701b120 not found: ID does not exist" containerID="a2877d0e2b7d6986f1fd5bb5dad39577482461175836fd3b0393f0143701b120" Oct 01 13:13:57 crc kubenswrapper[4727]: I1001 13:13:57.501441 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2877d0e2b7d6986f1fd5bb5dad39577482461175836fd3b0393f0143701b120"} err="failed to get container status \"a2877d0e2b7d6986f1fd5bb5dad39577482461175836fd3b0393f0143701b120\": rpc error: code = NotFound desc = could not find container \"a2877d0e2b7d6986f1fd5bb5dad39577482461175836fd3b0393f0143701b120\": container with ID starting with a2877d0e2b7d6986f1fd5bb5dad39577482461175836fd3b0393f0143701b120 not found: ID does not exist" Oct 01 13:13:58 crc kubenswrapper[4727]: I1001 13:13:58.383849 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd07db42-b958-40ac-bc17-5046c9619784" path="/var/lib/kubelet/pods/fd07db42-b958-40ac-bc17-5046c9619784/volumes" Oct 01 13:14:03 crc kubenswrapper[4727]: I1001 13:14:03.292113 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:14:03 crc kubenswrapper[4727]: I1001 13:14:03.292702 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.190059 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dwxn2"] Oct 01 13:14:06 crc kubenswrapper[4727]: E1001 13:14:06.190853 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd07db42-b958-40ac-bc17-5046c9619784" containerName="registry-server" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.190869 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd07db42-b958-40ac-bc17-5046c9619784" containerName="registry-server" Oct 01 13:14:06 crc kubenswrapper[4727]: E1001 13:14:06.190890 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd07db42-b958-40ac-bc17-5046c9619784" containerName="extract-utilities" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.190896 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd07db42-b958-40ac-bc17-5046c9619784" containerName="extract-utilities" Oct 01 13:14:06 crc kubenswrapper[4727]: E1001 13:14:06.190923 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd07db42-b958-40ac-bc17-5046c9619784" containerName="extract-content" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.190929 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd07db42-b958-40ac-bc17-5046c9619784" containerName="extract-content" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.191139 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd07db42-b958-40ac-bc17-5046c9619784" containerName="registry-server" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.192582 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.203812 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dwxn2"] Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.327897 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33fc9d09-977a-4903-99e5-d2c818ad8bfb-catalog-content\") pod \"certified-operators-dwxn2\" (UID: \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\") " pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.328615 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2ddj\" (UniqueName: \"kubernetes.io/projected/33fc9d09-977a-4903-99e5-d2c818ad8bfb-kube-api-access-g2ddj\") pod \"certified-operators-dwxn2\" (UID: \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\") " pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.328751 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33fc9d09-977a-4903-99e5-d2c818ad8bfb-utilities\") pod \"certified-operators-dwxn2\" (UID: \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\") " pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.430317 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2ddj\" (UniqueName: \"kubernetes.io/projected/33fc9d09-977a-4903-99e5-d2c818ad8bfb-kube-api-access-g2ddj\") pod \"certified-operators-dwxn2\" (UID: \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\") " pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.430381 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33fc9d09-977a-4903-99e5-d2c818ad8bfb-utilities\") pod \"certified-operators-dwxn2\" (UID: \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\") " pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.430471 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33fc9d09-977a-4903-99e5-d2c818ad8bfb-catalog-content\") pod \"certified-operators-dwxn2\" (UID: \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\") " pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.430954 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33fc9d09-977a-4903-99e5-d2c818ad8bfb-catalog-content\") pod \"certified-operators-dwxn2\" (UID: \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\") " pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.431151 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33fc9d09-977a-4903-99e5-d2c818ad8bfb-utilities\") pod \"certified-operators-dwxn2\" (UID: \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\") " pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.459871 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g2ddj\" (UniqueName: \"kubernetes.io/projected/33fc9d09-977a-4903-99e5-d2c818ad8bfb-kube-api-access-g2ddj\") pod \"certified-operators-dwxn2\" (UID: \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\") " pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:06 crc kubenswrapper[4727]: I1001 13:14:06.515074 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:07 crc kubenswrapper[4727]: I1001 13:14:07.000037 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dwxn2"] Oct 01 13:14:07 crc kubenswrapper[4727]: I1001 13:14:07.469250 4727 generic.go:334] "Generic (PLEG): container finished" podID="33fc9d09-977a-4903-99e5-d2c818ad8bfb" containerID="397b8d850b46492f2ba76f59fc51abb0e174a6e45e8b2f43bf57dcf51b38922d" exitCode=0 Oct 01 13:14:07 crc kubenswrapper[4727]: I1001 13:14:07.469589 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwxn2" event={"ID":"33fc9d09-977a-4903-99e5-d2c818ad8bfb","Type":"ContainerDied","Data":"397b8d850b46492f2ba76f59fc51abb0e174a6e45e8b2f43bf57dcf51b38922d"} Oct 01 13:14:07 crc kubenswrapper[4727]: I1001 13:14:07.469970 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwxn2" event={"ID":"33fc9d09-977a-4903-99e5-d2c818ad8bfb","Type":"ContainerStarted","Data":"a6913269f78f340d63bf7db5ad0e135b8d6e755ecec6adf55980710bf20a5026"} Oct 01 13:14:08 crc kubenswrapper[4727]: I1001 13:14:08.481922 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwxn2" event={"ID":"33fc9d09-977a-4903-99e5-d2c818ad8bfb","Type":"ContainerStarted","Data":"be366e96cb237137e16df383afcb6f83f58d142b6aa7a2f43b97f08779317c55"} Oct 01 13:14:09 crc kubenswrapper[4727]: I1001 13:14:09.492845 4727 generic.go:334] "Generic (PLEG): container finished" podID="33fc9d09-977a-4903-99e5-d2c818ad8bfb" containerID="be366e96cb237137e16df383afcb6f83f58d142b6aa7a2f43b97f08779317c55" exitCode=0 Oct 01 13:14:09 crc kubenswrapper[4727]: I1001 13:14:09.493229 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwxn2" event={"ID":"33fc9d09-977a-4903-99e5-d2c818ad8bfb","Type":"ContainerDied","Data":"be366e96cb237137e16df383afcb6f83f58d142b6aa7a2f43b97f08779317c55"} Oct 01 13:14:10 crc kubenswrapper[4727]: I1001 13:14:10.507903 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwxn2" event={"ID":"33fc9d09-977a-4903-99e5-d2c818ad8bfb","Type":"ContainerStarted","Data":"e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f"} Oct 01 13:14:10 crc kubenswrapper[4727]: I1001 13:14:10.531423 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dwxn2" podStartSLOduration=2.092201086 podStartE2EDuration="4.531404084s" podCreationTimestamp="2025-10-01 13:14:06 +0000 UTC" firstStartedPulling="2025-10-01 13:14:07.47120255 +0000 UTC m=+2225.792557387" lastFinishedPulling="2025-10-01 13:14:09.910405548 +0000 UTC m=+2228.231760385" observedRunningTime="2025-10-01 13:14:10.529374341 +0000 UTC m=+2228.850729198" watchObservedRunningTime="2025-10-01 13:14:10.531404084 +0000 UTC m=+2228.852758931" Oct 01 13:14:14 crc kubenswrapper[4727]: I1001 13:14:14.407754 4727 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/swift-proxy-7df68f6869-rwfcm" podUID="7c5e6c5d-4f10-437f-b20e-f3394093b3b9" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 01 13:14:16 crc kubenswrapper[4727]: I1001 13:14:16.516249 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:16 crc kubenswrapper[4727]: I1001 13:14:16.516658 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:16 crc kubenswrapper[4727]: I1001 13:14:16.564058 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:16 crc kubenswrapper[4727]: I1001 13:14:16.623533 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:16 crc kubenswrapper[4727]: I1001 13:14:16.799334 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dwxn2"] Oct 01 13:14:18 crc kubenswrapper[4727]: I1001 13:14:18.590308 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dwxn2" podUID="33fc9d09-977a-4903-99e5-d2c818ad8bfb" containerName="registry-server" containerID="cri-o://e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f" gracePeriod=2 Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.057692 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.092808 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33fc9d09-977a-4903-99e5-d2c818ad8bfb-utilities\") pod \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\" (UID: \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\") " Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.092904 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2ddj\" (UniqueName: \"kubernetes.io/projected/33fc9d09-977a-4903-99e5-d2c818ad8bfb-kube-api-access-g2ddj\") pod \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\" (UID: \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\") " Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.092938 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33fc9d09-977a-4903-99e5-d2c818ad8bfb-catalog-content\") pod \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\" (UID: \"33fc9d09-977a-4903-99e5-d2c818ad8bfb\") " Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.094583 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33fc9d09-977a-4903-99e5-d2c818ad8bfb-utilities" (OuterVolumeSpecName: "utilities") pod "33fc9d09-977a-4903-99e5-d2c818ad8bfb" (UID: "33fc9d09-977a-4903-99e5-d2c818ad8bfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.103061 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33fc9d09-977a-4903-99e5-d2c818ad8bfb-kube-api-access-g2ddj" (OuterVolumeSpecName: "kube-api-access-g2ddj") pod "33fc9d09-977a-4903-99e5-d2c818ad8bfb" (UID: "33fc9d09-977a-4903-99e5-d2c818ad8bfb"). 
InnerVolumeSpecName "kube-api-access-g2ddj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.194205 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33fc9d09-977a-4903-99e5-d2c818ad8bfb-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.194242 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2ddj\" (UniqueName: \"kubernetes.io/projected/33fc9d09-977a-4903-99e5-d2c818ad8bfb-kube-api-access-g2ddj\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.547322 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33fc9d09-977a-4903-99e5-d2c818ad8bfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33fc9d09-977a-4903-99e5-d2c818ad8bfb" (UID: "33fc9d09-977a-4903-99e5-d2c818ad8bfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.602846 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33fc9d09-977a-4903-99e5-d2c818ad8bfb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.603498 4727 generic.go:334] "Generic (PLEG): container finished" podID="33fc9d09-977a-4903-99e5-d2c818ad8bfb" containerID="e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f" exitCode=0 Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.603544 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwxn2" event={"ID":"33fc9d09-977a-4903-99e5-d2c818ad8bfb","Type":"ContainerDied","Data":"e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f"} Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.603560 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dwxn2" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.603604 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwxn2" event={"ID":"33fc9d09-977a-4903-99e5-d2c818ad8bfb","Type":"ContainerDied","Data":"a6913269f78f340d63bf7db5ad0e135b8d6e755ecec6adf55980710bf20a5026"} Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.603627 4727 scope.go:117] "RemoveContainer" containerID="e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.628654 4727 scope.go:117] "RemoveContainer" containerID="be366e96cb237137e16df383afcb6f83f58d142b6aa7a2f43b97f08779317c55" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.643969 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dwxn2"] Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.652475 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dwxn2"] Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.676776 4727 scope.go:117] "RemoveContainer" containerID="397b8d850b46492f2ba76f59fc51abb0e174a6e45e8b2f43bf57dcf51b38922d" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.693494 4727 scope.go:117] "RemoveContainer" containerID="e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f" Oct 01 13:14:19 crc kubenswrapper[4727]: E1001 13:14:19.693879 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f\": container with ID starting with e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f not found: ID does not exist" containerID="e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.693907 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f"} err="failed to get container status \"e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f\": rpc error: code = NotFound desc = could not find container \"e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f\": container with ID starting with e802a8d8b16d73932668f975744b7c5d2efe6d69ae85157282744e87553b0b5f not found: ID does not exist" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.693930 4727 scope.go:117] "RemoveContainer" containerID="be366e96cb237137e16df383afcb6f83f58d142b6aa7a2f43b97f08779317c55" Oct 01 13:14:19 crc kubenswrapper[4727]: E1001 13:14:19.694292 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be366e96cb237137e16df383afcb6f83f58d142b6aa7a2f43b97f08779317c55\": container with ID starting with be366e96cb237137e16df383afcb6f83f58d142b6aa7a2f43b97f08779317c55 not found: ID does not exist" containerID="be366e96cb237137e16df383afcb6f83f58d142b6aa7a2f43b97f08779317c55" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.694342 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be366e96cb237137e16df383afcb6f83f58d142b6aa7a2f43b97f08779317c55"} err="failed to get container status \"be366e96cb237137e16df383afcb6f83f58d142b6aa7a2f43b97f08779317c55\": rpc error: code = NotFound desc = could not find 
container \"be366e96cb237137e16df383afcb6f83f58d142b6aa7a2f43b97f08779317c55\": container with ID starting with be366e96cb237137e16df383afcb6f83f58d142b6aa7a2f43b97f08779317c55 not found: ID does not exist" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.694361 4727 scope.go:117] "RemoveContainer" containerID="397b8d850b46492f2ba76f59fc51abb0e174a6e45e8b2f43bf57dcf51b38922d" Oct 01 13:14:19 crc kubenswrapper[4727]: E1001 13:14:19.694550 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"397b8d850b46492f2ba76f59fc51abb0e174a6e45e8b2f43bf57dcf51b38922d\": container with ID starting with 397b8d850b46492f2ba76f59fc51abb0e174a6e45e8b2f43bf57dcf51b38922d not found: ID does not exist" containerID="397b8d850b46492f2ba76f59fc51abb0e174a6e45e8b2f43bf57dcf51b38922d" Oct 01 13:14:19 crc kubenswrapper[4727]: I1001 13:14:19.694575 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397b8d850b46492f2ba76f59fc51abb0e174a6e45e8b2f43bf57dcf51b38922d"} err="failed to get container status \"397b8d850b46492f2ba76f59fc51abb0e174a6e45e8b2f43bf57dcf51b38922d\": rpc error: code = NotFound desc = could not find container \"397b8d850b46492f2ba76f59fc51abb0e174a6e45e8b2f43bf57dcf51b38922d\": container with ID starting with 397b8d850b46492f2ba76f59fc51abb0e174a6e45e8b2f43bf57dcf51b38922d not found: ID does not exist" Oct 01 13:14:20 crc kubenswrapper[4727]: I1001 13:14:20.383370 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33fc9d09-977a-4903-99e5-d2c818ad8bfb" path="/var/lib/kubelet/pods/33fc9d09-977a-4903-99e5-d2c818ad8bfb/volumes" Oct 01 13:14:33 crc kubenswrapper[4727]: I1001 13:14:33.291667 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:14:33 crc kubenswrapper[4727]: I1001 13:14:33.292206 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.157473 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s"] Oct 01 13:15:00 crc kubenswrapper[4727]: E1001 13:15:00.159736 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33fc9d09-977a-4903-99e5-d2c818ad8bfb" containerName="extract-content" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.159847 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="33fc9d09-977a-4903-99e5-d2c818ad8bfb" containerName="extract-content" Oct 01 13:15:00 crc kubenswrapper[4727]: E1001 13:15:00.159968 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33fc9d09-977a-4903-99e5-d2c818ad8bfb" containerName="extract-utilities" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.160080 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="33fc9d09-977a-4903-99e5-d2c818ad8bfb" containerName="extract-utilities" Oct 01 13:15:00 crc kubenswrapper[4727]: E1001 13:15:00.160172 4727 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="33fc9d09-977a-4903-99e5-d2c818ad8bfb" containerName="registry-server" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.160253 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="33fc9d09-977a-4903-99e5-d2c818ad8bfb" containerName="registry-server" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.160587 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="33fc9d09-977a-4903-99e5-d2c818ad8bfb" containerName="registry-server" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.162048 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.164614 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s"] Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.165327 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.165762 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.189267 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8bms\" (UniqueName: \"kubernetes.io/projected/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-kube-api-access-z8bms\") pod \"collect-profiles-29322075-gpx5s\" (UID: \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.189398 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-secret-volume\") pod \"collect-profiles-29322075-gpx5s\" (UID: \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.189448 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-config-volume\") pod \"collect-profiles-29322075-gpx5s\" (UID: \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.291734 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-config-volume\") pod \"collect-profiles-29322075-gpx5s\" (UID: \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.292275 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8bms\" (UniqueName: \"kubernetes.io/projected/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-kube-api-access-z8bms\") pod \"collect-profiles-29322075-gpx5s\" (UID: \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.292640 4727 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-secret-volume\") pod \"collect-profiles-29322075-gpx5s\" (UID: \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.292871 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-config-volume\") pod \"collect-profiles-29322075-gpx5s\" (UID: \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.299316 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-secret-volume\") pod \"collect-profiles-29322075-gpx5s\" (UID: \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.310629 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8bms\" (UniqueName: \"kubernetes.io/projected/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-kube-api-access-z8bms\") pod \"collect-profiles-29322075-gpx5s\" (UID: \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:00 crc kubenswrapper[4727]: I1001 13:15:00.496044 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:01 crc kubenswrapper[4727]: I1001 13:15:01.093504 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s"] Oct 01 13:15:02 crc kubenswrapper[4727]: I1001 13:15:02.004679 4727 generic.go:334] "Generic (PLEG): container finished" podID="fa6b8e5f-3f7e-46c3-a043-831a31855fe3" containerID="c98582034f7f849db6a83aa059cbd06dae2d7049f38a89477e1f28d9d66a4374" exitCode=0 Oct 01 13:15:02 crc kubenswrapper[4727]: I1001 13:15:02.004905 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" event={"ID":"fa6b8e5f-3f7e-46c3-a043-831a31855fe3","Type":"ContainerDied","Data":"c98582034f7f849db6a83aa059cbd06dae2d7049f38a89477e1f28d9d66a4374"} Oct 01 13:15:02 crc kubenswrapper[4727]: I1001 13:15:02.004935 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" event={"ID":"fa6b8e5f-3f7e-46c3-a043-831a31855fe3","Type":"ContainerStarted","Data":"5857f921843203afee09032d42901db43660a8119e68b8894338027b4de9ad12"} Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.292896 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.293450 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.293508 4727 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.294330 4727 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2"} pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.294381 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" containerID="cri-o://fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" gracePeriod=600 Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.375877 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:03 crc kubenswrapper[4727]: E1001 13:15:03.419156 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.558752 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-secret-volume\") pod \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\" (UID: \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\") " Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.558918 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-config-volume\") pod \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\" (UID: \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\") " Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.559511 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-config-volume" (OuterVolumeSpecName: "config-volume") pod "fa6b8e5f-3f7e-46c3-a043-831a31855fe3" (UID: "fa6b8e5f-3f7e-46c3-a043-831a31855fe3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.559624 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8bms\" (UniqueName: \"kubernetes.io/projected/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-kube-api-access-z8bms\") pod \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\" (UID: \"fa6b8e5f-3f7e-46c3-a043-831a31855fe3\") " Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.561451 4727 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.566378 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fa6b8e5f-3f7e-46c3-a043-831a31855fe3" (UID: "fa6b8e5f-3f7e-46c3-a043-831a31855fe3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.568513 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-kube-api-access-z8bms" (OuterVolumeSpecName: "kube-api-access-z8bms") pod "fa6b8e5f-3f7e-46c3-a043-831a31855fe3" (UID: "fa6b8e5f-3f7e-46c3-a043-831a31855fe3"). InnerVolumeSpecName "kube-api-access-z8bms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.663306 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8bms\" (UniqueName: \"kubernetes.io/projected/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-kube-api-access-z8bms\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:03 crc kubenswrapper[4727]: I1001 13:15:03.663692 4727 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fa6b8e5f-3f7e-46c3-a043-831a31855fe3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:04 crc kubenswrapper[4727]: I1001 13:15:04.027333 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" Oct 01 13:15:04 crc kubenswrapper[4727]: I1001 13:15:04.027330 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-gpx5s" event={"ID":"fa6b8e5f-3f7e-46c3-a043-831a31855fe3","Type":"ContainerDied","Data":"5857f921843203afee09032d42901db43660a8119e68b8894338027b4de9ad12"} Oct 01 13:15:04 crc kubenswrapper[4727]: I1001 13:15:04.027404 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5857f921843203afee09032d42901db43660a8119e68b8894338027b4de9ad12" Oct 01 13:15:04 crc kubenswrapper[4727]: I1001 13:15:04.039698 4727 generic.go:334] "Generic (PLEG): container finished" podID="d18290ae-64a5-44a5-a704-90977d85852b" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" exitCode=0 Oct 01 13:15:04 crc kubenswrapper[4727]: I1001 13:15:04.039796 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerDied","Data":"fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2"} Oct 01 13:15:04 crc kubenswrapper[4727]: I1001 13:15:04.039908 4727 scope.go:117] "RemoveContainer" containerID="ffe3e19b0829e296a5017db986302c2eb85d3b8446c095789e9f37c908e4271f" Oct 01 13:15:04 crc kubenswrapper[4727]: I1001 13:15:04.043169 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:15:04 crc kubenswrapper[4727]: E1001 13:15:04.043966 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:15:04 crc kubenswrapper[4727]: I1001 13:15:04.468886 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n"] Oct 01 13:15:04 crc kubenswrapper[4727]: I1001 13:15:04.479344 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322030-zjx2n"] Oct 01 13:15:06 crc kubenswrapper[4727]: I1001 13:15:06.384409 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1974282f-c2f4-48cd-97e2-9e880203ef1c" path="/var/lib/kubelet/pods/1974282f-c2f4-48cd-97e2-9e880203ef1c/volumes" Oct 01 13:15:11 crc kubenswrapper[4727]: I1001 13:15:11.729467 4727 scope.go:117] "RemoveContainer" containerID="d9fb1e4352d7559f140d9f5fdb64bd79ffc4b43f2e90b5d94ac7269b7e83d9b5" Oct 01 13:15:14 crc kubenswrapper[4727]: I1001 13:15:14.373123 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:15:14 crc kubenswrapper[4727]: E1001 13:15:14.373711 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" 
podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:15:24 crc kubenswrapper[4727]: I1001 13:15:24.247124 4727 generic.go:334] "Generic (PLEG): container finished" podID="a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d" containerID="e77f97a548fd927d70323ea4afd8cd99d10a487fe2d1faffc9516515f772cfb4" exitCode=0 Oct 01 13:15:24 crc kubenswrapper[4727]: I1001 13:15:24.247236 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" event={"ID":"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d","Type":"ContainerDied","Data":"e77f97a548fd927d70323ea4afd8cd99d10a487fe2d1faffc9516515f772cfb4"} Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.654386 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.701322 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpfhk\" (UniqueName: \"kubernetes.io/projected/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-kube-api-access-bpfhk\") pod \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.701535 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-ssh-key\") pod \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.701601 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-inventory\") pod \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.701697 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-libvirt-combined-ca-bundle\") pod \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.701761 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-libvirt-secret-0\") pod \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\" (UID: \"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d\") " Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.706226 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d" (UID: "a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.708612 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-kube-api-access-bpfhk" (OuterVolumeSpecName: "kube-api-access-bpfhk") pod "a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d" (UID: "a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d"). InnerVolumeSpecName "kube-api-access-bpfhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.728578 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d" (UID: "a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.731089 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d" (UID: "a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.731509 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-inventory" (OuterVolumeSpecName: "inventory") pod "a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d" (UID: "a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.804263 4727 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.804303 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpfhk\" (UniqueName: \"kubernetes.io/projected/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-kube-api-access-bpfhk\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.804314 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.804325 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:25 crc kubenswrapper[4727]: I1001 13:15:25.804333 4727 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.266044 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" event={"ID":"a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d","Type":"ContainerDied","Data":"e99134c3ea57d80b996fa1ef9489d3152498826ecb6daff639d87a43d2e54f16"} Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.266390 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e99134c3ea57d80b996fa1ef9489d3152498826ecb6daff639d87a43d2e54f16" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.266109 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.371343 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767"] Oct 01 13:15:26 crc kubenswrapper[4727]: E1001 13:15:26.371783 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.371808 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 13:15:26 crc kubenswrapper[4727]: E1001 13:15:26.371856 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6b8e5f-3f7e-46c3-a043-831a31855fe3" containerName="collect-profiles" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.371864 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6b8e5f-3f7e-46c3-a043-831a31855fe3" containerName="collect-profiles" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.372106 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6b8e5f-3f7e-46c3-a043-831a31855fe3" containerName="collect-profiles" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.372138 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.373025 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.375562 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.378250 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.378363 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.378700 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.378874 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.379073 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.392336 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767"] Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.422587 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.529100 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.529160 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.529198 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.529222 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm9dp\" (UniqueName: \"kubernetes.io/projected/9622021e-ef0b-4274-a356-f61405a2dd9b-kube-api-access-cm9dp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.529267 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.529336 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.529522 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.529649 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.529685 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-combined-ca-bundle\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.631316 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.631485 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.631513 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.631669 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.631758 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.631872 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.631958 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm9dp\" (UniqueName: \"kubernetes.io/projected/9622021e-ef0b-4274-a356-f61405a2dd9b-kube-api-access-cm9dp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.632298 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: 
\"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.632460 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.633474 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.638250 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.638266 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.638902 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.640630 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.640844 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.641751 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc 
kubenswrapper[4727]: I1001 13:15:26.642486 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.655951 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm9dp\" (UniqueName: \"kubernetes.io/projected/9622021e-ef0b-4274-a356-f61405a2dd9b-kube-api-access-cm9dp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9b767\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:26 crc kubenswrapper[4727]: I1001 13:15:26.752479 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:15:27 crc kubenswrapper[4727]: I1001 13:15:27.266029 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767"] Oct 01 13:15:27 crc kubenswrapper[4727]: W1001 13:15:27.269839 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9622021e_ef0b_4274_a356_f61405a2dd9b.slice/crio-60efc73cf876f3e59633ff23042e38e9c044c65a274f58ee8f181eaca7dd38b6 WatchSource:0}: Error finding container 60efc73cf876f3e59633ff23042e38e9c044c65a274f58ee8f181eaca7dd38b6: Status 404 returned error can't find the container with id 60efc73cf876f3e59633ff23042e38e9c044c65a274f58ee8f181eaca7dd38b6 Oct 01 13:15:27 crc kubenswrapper[4727]: I1001 13:15:27.372148 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:15:27 crc kubenswrapper[4727]: E1001 13:15:27.372439 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:15:28 crc kubenswrapper[4727]: I1001 13:15:28.286454 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" event={"ID":"9622021e-ef0b-4274-a356-f61405a2dd9b","Type":"ContainerStarted","Data":"60efc73cf876f3e59633ff23042e38e9c044c65a274f58ee8f181eaca7dd38b6"} Oct 01 13:15:29 crc kubenswrapper[4727]: I1001 13:15:29.296129 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" event={"ID":"9622021e-ef0b-4274-a356-f61405a2dd9b","Type":"ContainerStarted","Data":"82b14691c3f443a214f9fb8e079fe34278ae7a54f821c02ffc67ee23a52c619b"} Oct 01 13:15:29 crc kubenswrapper[4727]: I1001 13:15:29.322067 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" podStartSLOduration=2.160142709 podStartE2EDuration="3.322036271s" podCreationTimestamp="2025-10-01 13:15:26 +0000 UTC" firstStartedPulling="2025-10-01 13:15:27.273036256 +0000 UTC m=+2305.594391093" lastFinishedPulling="2025-10-01 
13:15:28.434929808 +0000 UTC m=+2306.756284655" observedRunningTime="2025-10-01 13:15:29.312331056 +0000 UTC m=+2307.633685903" watchObservedRunningTime="2025-10-01 13:15:29.322036271 +0000 UTC m=+2307.643391108" Oct 01 13:15:41 crc kubenswrapper[4727]: I1001 13:15:41.373081 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:15:41 crc kubenswrapper[4727]: E1001 13:15:41.373825 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:15:56 crc kubenswrapper[4727]: I1001 13:15:56.372628 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:15:56 crc kubenswrapper[4727]: E1001 13:15:56.373288 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:16:08 crc kubenswrapper[4727]: I1001 13:16:08.372809 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:16:08 crc kubenswrapper[4727]: E1001 13:16:08.374165 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:16:23 crc kubenswrapper[4727]: I1001 13:16:23.372163 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:16:23 crc kubenswrapper[4727]: E1001 13:16:23.373101 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:16:36 crc kubenswrapper[4727]: I1001 13:16:36.372187 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:16:36 crc kubenswrapper[4727]: E1001 13:16:36.372902 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" 
podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:16:51 crc kubenswrapper[4727]: I1001 13:16:51.371978 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:16:51 crc kubenswrapper[4727]: E1001 13:16:51.373975 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:17:05 crc kubenswrapper[4727]: I1001 13:17:05.373102 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:17:05 crc kubenswrapper[4727]: E1001 13:17:05.375264 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:17:17 crc kubenswrapper[4727]: I1001 13:17:17.374096 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:17:17 crc kubenswrapper[4727]: E1001 13:17:17.374960 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:17:29 crc kubenswrapper[4727]: I1001 13:17:29.372848 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:17:29 crc kubenswrapper[4727]: E1001 13:17:29.373683 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:17:42 crc kubenswrapper[4727]: I1001 13:17:42.378134 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:17:42 crc kubenswrapper[4727]: E1001 13:17:42.378919 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:17:56 crc kubenswrapper[4727]: I1001 13:17:56.373908 4727 scope.go:117] "RemoveContainer" 
containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:17:56 crc kubenswrapper[4727]: E1001 13:17:56.374971 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:18:10 crc kubenswrapper[4727]: I1001 13:18:10.372859 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:18:10 crc kubenswrapper[4727]: E1001 13:18:10.373747 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:18:24 crc kubenswrapper[4727]: I1001 13:18:24.373207 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:18:24 crc kubenswrapper[4727]: E1001 13:18:24.374025 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:18:35 crc kubenswrapper[4727]: I1001 13:18:35.372495 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:18:35 crc kubenswrapper[4727]: E1001 13:18:35.373482 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:18:50 crc kubenswrapper[4727]: I1001 13:18:50.372239 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:18:50 crc kubenswrapper[4727]: E1001 13:18:50.372957 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:18:59 crc kubenswrapper[4727]: I1001 13:18:59.300460 4727 generic.go:334] "Generic (PLEG): container finished" podID="9622021e-ef0b-4274-a356-f61405a2dd9b" containerID="82b14691c3f443a214f9fb8e079fe34278ae7a54f821c02ffc67ee23a52c619b" exitCode=0 Oct 01 13:18:59 crc 
kubenswrapper[4727]: I1001 13:18:59.300526 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" event={"ID":"9622021e-ef0b-4274-a356-f61405a2dd9b","Type":"ContainerDied","Data":"82b14691c3f443a214f9fb8e079fe34278ae7a54f821c02ffc67ee23a52c619b"} Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.779216 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.879707 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-combined-ca-bundle\") pod \"9622021e-ef0b-4274-a356-f61405a2dd9b\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.879775 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-extra-config-0\") pod \"9622021e-ef0b-4274-a356-f61405a2dd9b\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.879818 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-ssh-key\") pod \"9622021e-ef0b-4274-a356-f61405a2dd9b\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.879865 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm9dp\" (UniqueName: \"kubernetes.io/projected/9622021e-ef0b-4274-a356-f61405a2dd9b-kube-api-access-cm9dp\") pod \"9622021e-ef0b-4274-a356-f61405a2dd9b\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.879894 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-inventory\") pod \"9622021e-ef0b-4274-a356-f61405a2dd9b\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.879920 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-cell1-compute-config-0\") pod \"9622021e-ef0b-4274-a356-f61405a2dd9b\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.879948 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-cell1-compute-config-1\") pod \"9622021e-ef0b-4274-a356-f61405a2dd9b\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.880029 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-migration-ssh-key-0\") pod \"9622021e-ef0b-4274-a356-f61405a2dd9b\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.880138 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-migration-ssh-key-1\") pod \"9622021e-ef0b-4274-a356-f61405a2dd9b\" (UID: \"9622021e-ef0b-4274-a356-f61405a2dd9b\") " Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.889158 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9622021e-ef0b-4274-a356-f61405a2dd9b" (UID: "9622021e-ef0b-4274-a356-f61405a2dd9b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.889635 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9622021e-ef0b-4274-a356-f61405a2dd9b-kube-api-access-cm9dp" (OuterVolumeSpecName: "kube-api-access-cm9dp") pod "9622021e-ef0b-4274-a356-f61405a2dd9b" (UID: "9622021e-ef0b-4274-a356-f61405a2dd9b"). InnerVolumeSpecName "kube-api-access-cm9dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.910068 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "9622021e-ef0b-4274-a356-f61405a2dd9b" (UID: "9622021e-ef0b-4274-a356-f61405a2dd9b"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.915233 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-inventory" (OuterVolumeSpecName: "inventory") pod "9622021e-ef0b-4274-a356-f61405a2dd9b" (UID: "9622021e-ef0b-4274-a356-f61405a2dd9b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.918950 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "9622021e-ef0b-4274-a356-f61405a2dd9b" (UID: "9622021e-ef0b-4274-a356-f61405a2dd9b"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.920131 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "9622021e-ef0b-4274-a356-f61405a2dd9b" (UID: "9622021e-ef0b-4274-a356-f61405a2dd9b"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.924363 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "9622021e-ef0b-4274-a356-f61405a2dd9b" (UID: "9622021e-ef0b-4274-a356-f61405a2dd9b"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.925537 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9622021e-ef0b-4274-a356-f61405a2dd9b" (UID: "9622021e-ef0b-4274-a356-f61405a2dd9b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.929227 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "9622021e-ef0b-4274-a356-f61405a2dd9b" (UID: "9622021e-ef0b-4274-a356-f61405a2dd9b"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.982180 4727 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.982247 4727 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.982260 4727 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.982272 4727 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.982291 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.982304 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm9dp\" (UniqueName: \"kubernetes.io/projected/9622021e-ef0b-4274-a356-f61405a2dd9b-kube-api-access-cm9dp\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.982317 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.982330 4727 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:00 crc kubenswrapper[4727]: I1001 13:19:00.982342 4727 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9622021e-ef0b-4274-a356-f61405a2dd9b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.322899 4727 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" event={"ID":"9622021e-ef0b-4274-a356-f61405a2dd9b","Type":"ContainerDied","Data":"60efc73cf876f3e59633ff23042e38e9c044c65a274f58ee8f181eaca7dd38b6"} Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.323416 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60efc73cf876f3e59633ff23042e38e9c044c65a274f58ee8f181eaca7dd38b6" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.323106 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9b767" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.439226 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs"] Oct 01 13:19:01 crc kubenswrapper[4727]: E1001 13:19:01.439770 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9622021e-ef0b-4274-a356-f61405a2dd9b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.439792 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="9622021e-ef0b-4274-a356-f61405a2dd9b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.439980 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="9622021e-ef0b-4274-a356-f61405a2dd9b" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.440769 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.445509 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.446177 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jcjb6" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.445501 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs"] Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.450594 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.450590 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.464064 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.494453 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.494504 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.494674 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.495082 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.495124 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.495574 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.495610 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x9mh\" (UniqueName: \"kubernetes.io/projected/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-kube-api-access-6x9mh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.598471 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.598571 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc 
kubenswrapper[4727]: I1001 13:19:01.598690 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.598722 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x9mh\" (UniqueName: \"kubernetes.io/projected/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-kube-api-access-6x9mh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.598856 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.598929 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.600890 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.605534 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.605731 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.606036 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc 
kubenswrapper[4727]: I1001 13:19:01.606726 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.606951 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.607161 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.620149 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x9mh\" (UniqueName: \"kubernetes.io/projected/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-kube-api-access-6x9mh\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:01 crc kubenswrapper[4727]: I1001 13:19:01.766633 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:19:02 crc kubenswrapper[4727]: I1001 13:19:02.353522 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs"] Oct 01 13:19:02 crc kubenswrapper[4727]: I1001 13:19:02.365815 4727 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:19:03 crc kubenswrapper[4727]: I1001 13:19:03.019323 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:19:03 crc kubenswrapper[4727]: I1001 13:19:03.346688 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" event={"ID":"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf","Type":"ContainerStarted","Data":"2e7e200a51b1ed76edefa4c7880313856f47a90a6c1d0ac9bbc469a44bd8cdbd"} Oct 01 13:19:03 crc kubenswrapper[4727]: I1001 13:19:03.346763 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" event={"ID":"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf","Type":"ContainerStarted","Data":"2d12b09315a64447dcb803a50c4d7d15e78f2aa48a3763d3d79fab02b321b42f"} Oct 01 13:19:03 crc kubenswrapper[4727]: I1001 13:19:03.368845 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" podStartSLOduration=1.717565337 podStartE2EDuration="2.368823851s" podCreationTimestamp="2025-10-01 13:19:01 +0000 UTC" firstStartedPulling="2025-10-01 13:19:02.365567473 +0000 UTC m=+2520.686922310" lastFinishedPulling="2025-10-01 13:19:03.016825987 +0000 UTC m=+2521.338180824" observedRunningTime="2025-10-01 13:19:03.36786477 +0000 UTC m=+2521.689219607" watchObservedRunningTime="2025-10-01 13:19:03.368823851 +0000 UTC m=+2521.690178698" Oct 01 13:19:05 crc kubenswrapper[4727]: I1001 13:19:05.373369 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:19:05 crc kubenswrapper[4727]: E1001 13:19:05.374181 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:19:17 crc kubenswrapper[4727]: I1001 13:19:17.372630 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:19:17 crc kubenswrapper[4727]: E1001 13:19:17.374388 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:19:30 crc kubenswrapper[4727]: I1001 13:19:30.373344 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:19:30 crc kubenswrapper[4727]: E1001 13:19:30.374161 4727 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:19:45 crc kubenswrapper[4727]: I1001 13:19:45.372935 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:19:45 crc kubenswrapper[4727]: E1001 13:19:45.373821 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:19:58 crc kubenswrapper[4727]: I1001 13:19:58.373105 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:19:58 crc kubenswrapper[4727]: E1001 13:19:58.373894 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:20:13 crc kubenswrapper[4727]: I1001 13:20:13.372407 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:20:14 crc kubenswrapper[4727]: I1001 13:20:14.026118 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"0b20f9df355cf8a1786e71a3d4bf9a8db762df0e7ec9fae3b46c91317a229a05"} Oct 01 13:22:02 crc kubenswrapper[4727]: I1001 13:22:02.105896 4727 generic.go:334] "Generic (PLEG): container finished" podID="9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf" containerID="2e7e200a51b1ed76edefa4c7880313856f47a90a6c1d0ac9bbc469a44bd8cdbd" exitCode=0 Oct 01 13:22:02 crc kubenswrapper[4727]: I1001 13:22:02.106057 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" event={"ID":"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf","Type":"ContainerDied","Data":"2e7e200a51b1ed76edefa4c7880313856f47a90a6c1d0ac9bbc469a44bd8cdbd"} Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.497148 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.650799 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-0\") pod \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.650896 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-inventory\") pod \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.650924 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x9mh\" (UniqueName: \"kubernetes.io/projected/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-kube-api-access-6x9mh\") pod \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.650952 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ssh-key\") pod \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.651138 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-telemetry-combined-ca-bundle\") pod \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.651165 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-1\") pod \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.651195 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-2\") pod \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\" (UID: \"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf\") " Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.658052 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-kube-api-access-6x9mh" (OuterVolumeSpecName: "kube-api-access-6x9mh") pod "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf" (UID: "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf"). InnerVolumeSpecName "kube-api-access-6x9mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.662480 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf" (UID: "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.685688 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf" (UID: "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.688329 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf" (UID: "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.688687 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf" (UID: "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.691477 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf" (UID: "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.692348 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-inventory" (OuterVolumeSpecName: "inventory") pod "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf" (UID: "9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.753880 4727 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.753926 4727 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.753938 4727 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.753949 4727 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.753959 4727 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.753969 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x9mh\" (UniqueName: \"kubernetes.io/projected/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-kube-api-access-6x9mh\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:03 crc kubenswrapper[4727]: I1001 13:22:03.754014 4727 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:04 crc kubenswrapper[4727]: I1001 13:22:04.126891 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" event={"ID":"9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf","Type":"ContainerDied","Data":"2d12b09315a64447dcb803a50c4d7d15e78f2aa48a3763d3d79fab02b321b42f"} Oct 01 13:22:04 crc kubenswrapper[4727]: I1001 13:22:04.127365 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d12b09315a64447dcb803a50c4d7d15e78f2aa48a3763d3d79fab02b321b42f" Oct 01 13:22:04 crc kubenswrapper[4727]: I1001 13:22:04.126971 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs" Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.132258 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.133681 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b6ef1c58-1426-4b49-90ff-9b5ee9cb6890" containerName="kube-state-metrics" containerID="cri-o://bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f" gracePeriod=30 Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.198562 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.198888 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="ceilometer-central-agent" containerID="cri-o://63dc9bc5f7734cbdfca6a4fc141e49feea3354c4b446f4d0ca4b5712359ba399" gracePeriod=30 Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.199426 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="proxy-httpd" containerID="cri-o://944afde0e103fc73e7a47ed130e684d0c440d8e628f93c3b732a5a221fd4c011" gracePeriod=30 Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.199491 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="sg-core" containerID="cri-o://f4923fa89c2267ae488b8d88230846a1379770af16007d56e8cc20a9ecff0ee3" gracePeriod=30 Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.199539 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="ceilometer-notification-agent" containerID="cri-o://e56268bdcc5fec5c8ab83a474aee84267e4ed72ce3a0fe6f4b4f95ede11cbd63" gracePeriod=30 Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.780127 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.867240 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-combined-ca-bundle\") pod \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.867331 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-state-metrics-tls-config\") pod \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.867453 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frcwj\" (UniqueName: \"kubernetes.io/projected/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-api-access-frcwj\") pod \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.867532 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-state-metrics-tls-certs\") pod \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\" (UID: \"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890\") " Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.873894 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-api-access-frcwj" (OuterVolumeSpecName: "kube-api-access-frcwj") pod "b6ef1c58-1426-4b49-90ff-9b5ee9cb6890" (UID: "b6ef1c58-1426-4b49-90ff-9b5ee9cb6890"). InnerVolumeSpecName "kube-api-access-frcwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.897907 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6ef1c58-1426-4b49-90ff-9b5ee9cb6890" (UID: "b6ef1c58-1426-4b49-90ff-9b5ee9cb6890"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.900974 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "b6ef1c58-1426-4b49-90ff-9b5ee9cb6890" (UID: "b6ef1c58-1426-4b49-90ff-9b5ee9cb6890"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.919639 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "b6ef1c58-1426-4b49-90ff-9b5ee9cb6890" (UID: "b6ef1c58-1426-4b49-90ff-9b5ee9cb6890"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.969928 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.969981 4727 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.969995 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frcwj\" (UniqueName: \"kubernetes.io/projected/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-api-access-frcwj\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:12 crc kubenswrapper[4727]: I1001 13:22:12.970011 4727 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.258919 4727 generic.go:334] "Generic (PLEG): container finished" podID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerID="944afde0e103fc73e7a47ed130e684d0c440d8e628f93c3b732a5a221fd4c011" exitCode=0 Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.258983 4727 generic.go:334] "Generic (PLEG): container finished" podID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerID="f4923fa89c2267ae488b8d88230846a1379770af16007d56e8cc20a9ecff0ee3" exitCode=2 Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.259016 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f4fac25-c782-4f4c-ab50-62969ea1f369","Type":"ContainerDied","Data":"944afde0e103fc73e7a47ed130e684d0c440d8e628f93c3b732a5a221fd4c011"} Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.259090 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f4fac25-c782-4f4c-ab50-62969ea1f369","Type":"ContainerDied","Data":"f4923fa89c2267ae488b8d88230846a1379770af16007d56e8cc20a9ecff0ee3"} Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.259109 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f4fac25-c782-4f4c-ab50-62969ea1f369","Type":"ContainerDied","Data":"63dc9bc5f7734cbdfca6a4fc141e49feea3354c4b446f4d0ca4b5712359ba399"} Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.258993 4727 generic.go:334] "Generic (PLEG): container finished" podID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerID="63dc9bc5f7734cbdfca6a4fc141e49feea3354c4b446f4d0ca4b5712359ba399" exitCode=0 Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.261396 4727 generic.go:334] "Generic (PLEG): container finished" podID="b6ef1c58-1426-4b49-90ff-9b5ee9cb6890" containerID="bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f" exitCode=2 Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.261444 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890","Type":"ContainerDied","Data":"bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f"} Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.261480 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"b6ef1c58-1426-4b49-90ff-9b5ee9cb6890","Type":"ContainerDied","Data":"878bb83299b7f8353c2bb08c6fe70b100f33939e0812e46f4e57d4ca5501a5a9"} Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.261508 4727 scope.go:117] "RemoveContainer" containerID="bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f" Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.261757 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.331376 4727 scope.go:117] "RemoveContainer" containerID="bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f" Oct 01 13:22:13 crc kubenswrapper[4727]: E1001 13:22:13.332071 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f\": container with ID starting with bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f not found: ID does not exist" containerID="bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f" Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.332138 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f"} err="failed to get container status \"bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f\": rpc error: code = NotFound desc = could not find container \"bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f\": container with ID starting with bb93fcb4fee965b183faaf931aaafc7286da3adf594dfb681fb2718101404c8f not found: ID does not exist" Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.344581 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:22:13 crc kubenswrapper[4727]: I1001 13:22:13.355209 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.293539 4727 generic.go:334] "Generic (PLEG): container finished" podID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerID="e56268bdcc5fec5c8ab83a474aee84267e4ed72ce3a0fe6f4b4f95ede11cbd63" exitCode=0 Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.293649 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f4fac25-c782-4f4c-ab50-62969ea1f369","Type":"ContainerDied","Data":"e56268bdcc5fec5c8ab83a474aee84267e4ed72ce3a0fe6f4b4f95ede11cbd63"} Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.388286 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6ef1c58-1426-4b49-90ff-9b5ee9cb6890" path="/var/lib/kubelet/pods/b6ef1c58-1426-4b49-90ff-9b5ee9cb6890/volumes" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.642815 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.706261 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f4fac25-c782-4f4c-ab50-62969ea1f369-log-httpd\") pod \"7f4fac25-c782-4f4c-ab50-62969ea1f369\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.706384 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-ceilometer-tls-certs\") pod \"7f4fac25-c782-4f4c-ab50-62969ea1f369\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.706426 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-sg-core-conf-yaml\") pod \"7f4fac25-c782-4f4c-ab50-62969ea1f369\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.706458 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f4fac25-c782-4f4c-ab50-62969ea1f369-run-httpd\") pod \"7f4fac25-c782-4f4c-ab50-62969ea1f369\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.706537 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdpt6\" (UniqueName: \"kubernetes.io/projected/7f4fac25-c782-4f4c-ab50-62969ea1f369-kube-api-access-sdpt6\") pod \"7f4fac25-c782-4f4c-ab50-62969ea1f369\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.706588 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-scripts\") pod \"7f4fac25-c782-4f4c-ab50-62969ea1f369\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.706642 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-combined-ca-bundle\") pod \"7f4fac25-c782-4f4c-ab50-62969ea1f369\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.706680 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-config-data\") pod \"7f4fac25-c782-4f4c-ab50-62969ea1f369\" (UID: \"7f4fac25-c782-4f4c-ab50-62969ea1f369\") " Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.707415 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f4fac25-c782-4f4c-ab50-62969ea1f369-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7f4fac25-c782-4f4c-ab50-62969ea1f369" (UID: "7f4fac25-c782-4f4c-ab50-62969ea1f369"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.708294 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f4fac25-c782-4f4c-ab50-62969ea1f369-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7f4fac25-c782-4f4c-ab50-62969ea1f369" (UID: "7f4fac25-c782-4f4c-ab50-62969ea1f369"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.721411 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-scripts" (OuterVolumeSpecName: "scripts") pod "7f4fac25-c782-4f4c-ab50-62969ea1f369" (UID: "7f4fac25-c782-4f4c-ab50-62969ea1f369"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.721642 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f4fac25-c782-4f4c-ab50-62969ea1f369-kube-api-access-sdpt6" (OuterVolumeSpecName: "kube-api-access-sdpt6") pod "7f4fac25-c782-4f4c-ab50-62969ea1f369" (UID: "7f4fac25-c782-4f4c-ab50-62969ea1f369"). InnerVolumeSpecName "kube-api-access-sdpt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.752071 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7f4fac25-c782-4f4c-ab50-62969ea1f369" (UID: "7f4fac25-c782-4f4c-ab50-62969ea1f369"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.778908 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7f4fac25-c782-4f4c-ab50-62969ea1f369" (UID: "7f4fac25-c782-4f4c-ab50-62969ea1f369"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.808919 4727 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f4fac25-c782-4f4c-ab50-62969ea1f369-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.808967 4727 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.808978 4727 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.808986 4727 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f4fac25-c782-4f4c-ab50-62969ea1f369-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.809021 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdpt6\" (UniqueName: \"kubernetes.io/projected/7f4fac25-c782-4f4c-ab50-62969ea1f369-kube-api-access-sdpt6\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.809034 4727 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.809469 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f4fac25-c782-4f4c-ab50-62969ea1f369" (UID: "7f4fac25-c782-4f4c-ab50-62969ea1f369"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.863188 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-config-data" (OuterVolumeSpecName: "config-data") pod "7f4fac25-c782-4f4c-ab50-62969ea1f369" (UID: "7f4fac25-c782-4f4c-ab50-62969ea1f369"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.910771 4727 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:14 crc kubenswrapper[4727]: I1001 13:22:14.910809 4727 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f4fac25-c782-4f4c-ab50-62969ea1f369-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:22:15 crc kubenswrapper[4727]: I1001 13:22:15.315097 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7f4fac25-c782-4f4c-ab50-62969ea1f369","Type":"ContainerDied","Data":"873bb98802845f78b22d9e2886a79aec9ddd7a17f018e3f2ad02a9a67c6c1414"} Oct 01 13:22:15 crc kubenswrapper[4727]: I1001 13:22:15.315194 4727 scope.go:117] "RemoveContainer" containerID="944afde0e103fc73e7a47ed130e684d0c440d8e628f93c3b732a5a221fd4c011" Oct 01 13:22:15 crc kubenswrapper[4727]: I1001 13:22:15.315331 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:22:15 crc kubenswrapper[4727]: I1001 13:22:15.346220 4727 scope.go:117] "RemoveContainer" containerID="f4923fa89c2267ae488b8d88230846a1379770af16007d56e8cc20a9ecff0ee3" Oct 01 13:22:15 crc kubenswrapper[4727]: I1001 13:22:15.359291 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:22:15 crc kubenswrapper[4727]: I1001 13:22:15.367721 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:22:15 crc kubenswrapper[4727]: I1001 13:22:15.379071 4727 scope.go:117] "RemoveContainer" containerID="e56268bdcc5fec5c8ab83a474aee84267e4ed72ce3a0fe6f4b4f95ede11cbd63" Oct 01 13:22:15 crc kubenswrapper[4727]: I1001 13:22:15.406197 4727 scope.go:117] "RemoveContainer" containerID="63dc9bc5f7734cbdfca6a4fc141e49feea3354c4b446f4d0ca4b5712359ba399" Oct 01 13:22:16 crc kubenswrapper[4727]: I1001 13:22:16.386577 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" path="/var/lib/kubelet/pods/7f4fac25-c782-4f4c-ab50-62969ea1f369/volumes" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.397325 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6wxt/must-gather-2jrvc"] Oct 01 13:22:27 crc kubenswrapper[4727]: E1001 13:22:27.398487 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.398505 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 13:22:27 crc kubenswrapper[4727]: E1001 13:22:27.398526 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="ceilometer-notification-agent" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.398532 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="ceilometer-notification-agent" Oct 01 13:22:27 crc kubenswrapper[4727]: E1001 13:22:27.398553 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ef1c58-1426-4b49-90ff-9b5ee9cb6890" 
containerName="kube-state-metrics" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.398560 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ef1c58-1426-4b49-90ff-9b5ee9cb6890" containerName="kube-state-metrics" Oct 01 13:22:27 crc kubenswrapper[4727]: E1001 13:22:27.398571 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="sg-core" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.398577 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="sg-core" Oct 01 13:22:27 crc kubenswrapper[4727]: E1001 13:22:27.398590 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="ceilometer-central-agent" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.398596 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="ceilometer-central-agent" Oct 01 13:22:27 crc kubenswrapper[4727]: E1001 13:22:27.398603 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="proxy-httpd" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.398609 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="proxy-httpd" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.398776 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="proxy-httpd" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.398799 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="ceilometer-central-agent" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.398814 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.398827 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="ceilometer-notification-agent" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.398834 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4fac25-c782-4f4c-ab50-62969ea1f369" containerName="sg-core" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.398843 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ef1c58-1426-4b49-90ff-9b5ee9cb6890" containerName="kube-state-metrics" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.399854 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6wxt/must-gather-2jrvc" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.411246 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-d6wxt"/"default-dockercfg-s2nc2" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.411247 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d6wxt"/"openshift-service-ca.crt" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.411320 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d6wxt"/"kube-root-ca.crt" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.438738 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d6wxt/must-gather-2jrvc"] Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.558197 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/479fa617-cf9f-4bf7-9290-5833831b934b-must-gather-output\") pod \"must-gather-2jrvc\" (UID: \"479fa617-cf9f-4bf7-9290-5833831b934b\") " pod="openshift-must-gather-d6wxt/must-gather-2jrvc" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.558425 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhndb\" (UniqueName: \"kubernetes.io/projected/479fa617-cf9f-4bf7-9290-5833831b934b-kube-api-access-rhndb\") pod \"must-gather-2jrvc\" (UID: \"479fa617-cf9f-4bf7-9290-5833831b934b\") " pod="openshift-must-gather-d6wxt/must-gather-2jrvc" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.659859 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhndb\" (UniqueName: \"kubernetes.io/projected/479fa617-cf9f-4bf7-9290-5833831b934b-kube-api-access-rhndb\") pod \"must-gather-2jrvc\" (UID: \"479fa617-cf9f-4bf7-9290-5833831b934b\") " pod="openshift-must-gather-d6wxt/must-gather-2jrvc" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.659969 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/479fa617-cf9f-4bf7-9290-5833831b934b-must-gather-output\") pod \"must-gather-2jrvc\" (UID: \"479fa617-cf9f-4bf7-9290-5833831b934b\") " pod="openshift-must-gather-d6wxt/must-gather-2jrvc" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.660908 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/479fa617-cf9f-4bf7-9290-5833831b934b-must-gather-output\") pod \"must-gather-2jrvc\" (UID: \"479fa617-cf9f-4bf7-9290-5833831b934b\") " pod="openshift-must-gather-d6wxt/must-gather-2jrvc" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.700717 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhndb\" (UniqueName: \"kubernetes.io/projected/479fa617-cf9f-4bf7-9290-5833831b934b-kube-api-access-rhndb\") pod \"must-gather-2jrvc\" (UID: \"479fa617-cf9f-4bf7-9290-5833831b934b\") " pod="openshift-must-gather-d6wxt/must-gather-2jrvc" Oct 01 13:22:27 crc kubenswrapper[4727]: I1001 13:22:27.726512 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6wxt/must-gather-2jrvc" Oct 01 13:22:28 crc kubenswrapper[4727]: I1001 13:22:28.250406 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d6wxt/must-gather-2jrvc"] Oct 01 13:22:28 crc kubenswrapper[4727]: W1001 13:22:28.252430 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod479fa617_cf9f_4bf7_9290_5833831b934b.slice/crio-b992e6f322a06d006a0f44e55f009160d540029fb50361590b79b075f09bd134 WatchSource:0}: Error finding container b992e6f322a06d006a0f44e55f009160d540029fb50361590b79b075f09bd134: Status 404 returned error can't find the container with id b992e6f322a06d006a0f44e55f009160d540029fb50361590b79b075f09bd134 Oct 01 13:22:28 crc kubenswrapper[4727]: I1001 13:22:28.458220 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/must-gather-2jrvc" event={"ID":"479fa617-cf9f-4bf7-9290-5833831b934b","Type":"ContainerStarted","Data":"b992e6f322a06d006a0f44e55f009160d540029fb50361590b79b075f09bd134"} Oct 01 13:22:33 crc kubenswrapper[4727]: I1001 13:22:33.292515 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:22:33 crc kubenswrapper[4727]: I1001 13:22:33.293591 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:22:36 crc kubenswrapper[4727]: I1001 13:22:36.554742 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/must-gather-2jrvc" event={"ID":"479fa617-cf9f-4bf7-9290-5833831b934b","Type":"ContainerStarted","Data":"e6c580c297dbab7a6d8786dc91a12f7272db08e8312eb4814f7107daab6e0171"} Oct 01 13:22:36 crc kubenswrapper[4727]: I1001 13:22:36.555403 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/must-gather-2jrvc" event={"ID":"479fa617-cf9f-4bf7-9290-5833831b934b","Type":"ContainerStarted","Data":"dbd39415acaebb0aabbc33cbbc19629236ef70f90fe2c1943ea4e16d007cd5eb"} Oct 01 13:22:36 crc kubenswrapper[4727]: I1001 13:22:36.588063 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d6wxt/must-gather-2jrvc" podStartSLOduration=2.304181213 podStartE2EDuration="9.588035773s" podCreationTimestamp="2025-10-01 13:22:27 +0000 UTC" firstStartedPulling="2025-10-01 13:22:28.256071843 +0000 UTC m=+2726.577426680" lastFinishedPulling="2025-10-01 13:22:35.539926403 +0000 UTC m=+2733.861281240" observedRunningTime="2025-10-01 13:22:36.58636108 +0000 UTC m=+2734.907715927" watchObservedRunningTime="2025-10-01 13:22:36.588035773 +0000 UTC m=+2734.909390630" Oct 01 13:22:41 crc kubenswrapper[4727]: I1001 13:22:41.844960 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6wxt/crc-debug-m798g"] Oct 01 13:22:41 crc kubenswrapper[4727]: I1001 13:22:41.847564 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6wxt/crc-debug-m798g" Oct 01 13:22:41 crc kubenswrapper[4727]: I1001 13:22:41.984618 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4381656-3c9d-44d0-a003-01b7d8b91b19-host\") pod \"crc-debug-m798g\" (UID: \"e4381656-3c9d-44d0-a003-01b7d8b91b19\") " pod="openshift-must-gather-d6wxt/crc-debug-m798g" Oct 01 13:22:41 crc kubenswrapper[4727]: I1001 13:22:41.984954 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s46hl\" (UniqueName: \"kubernetes.io/projected/e4381656-3c9d-44d0-a003-01b7d8b91b19-kube-api-access-s46hl\") pod \"crc-debug-m798g\" (UID: \"e4381656-3c9d-44d0-a003-01b7d8b91b19\") " pod="openshift-must-gather-d6wxt/crc-debug-m798g" Oct 01 13:22:42 crc kubenswrapper[4727]: I1001 13:22:42.086852 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4381656-3c9d-44d0-a003-01b7d8b91b19-host\") pod \"crc-debug-m798g\" (UID: \"e4381656-3c9d-44d0-a003-01b7d8b91b19\") " pod="openshift-must-gather-d6wxt/crc-debug-m798g" Oct 01 13:22:42 crc kubenswrapper[4727]: I1001 13:22:42.086904 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s46hl\" (UniqueName: \"kubernetes.io/projected/e4381656-3c9d-44d0-a003-01b7d8b91b19-kube-api-access-s46hl\") pod \"crc-debug-m798g\" (UID: \"e4381656-3c9d-44d0-a003-01b7d8b91b19\") " pod="openshift-must-gather-d6wxt/crc-debug-m798g" Oct 01 13:22:42 crc kubenswrapper[4727]: I1001 13:22:42.087093 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4381656-3c9d-44d0-a003-01b7d8b91b19-host\") pod \"crc-debug-m798g\" (UID: \"e4381656-3c9d-44d0-a003-01b7d8b91b19\") " pod="openshift-must-gather-d6wxt/crc-debug-m798g" Oct 01 13:22:42 crc kubenswrapper[4727]: I1001 13:22:42.120630 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s46hl\" (UniqueName: \"kubernetes.io/projected/e4381656-3c9d-44d0-a003-01b7d8b91b19-kube-api-access-s46hl\") pod \"crc-debug-m798g\" (UID: \"e4381656-3c9d-44d0-a003-01b7d8b91b19\") " pod="openshift-must-gather-d6wxt/crc-debug-m798g" Oct 01 13:22:42 crc kubenswrapper[4727]: I1001 13:22:42.166019 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6wxt/crc-debug-m798g" Oct 01 13:22:42 crc kubenswrapper[4727]: I1001 13:22:42.639668 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/crc-debug-m798g" event={"ID":"e4381656-3c9d-44d0-a003-01b7d8b91b19","Type":"ContainerStarted","Data":"243886b07328d907bcb1cfb9f2c58b566279fb9819c508384491ad1c0cd0325a"} Oct 01 13:22:54 crc kubenswrapper[4727]: I1001 13:22:54.776641 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/crc-debug-m798g" event={"ID":"e4381656-3c9d-44d0-a003-01b7d8b91b19","Type":"ContainerStarted","Data":"3ed39f8cbb5272e90e822965dee6caf1bbf712401b23ad0b558642ced7ab89ee"} Oct 01 13:22:54 crc kubenswrapper[4727]: I1001 13:22:54.794820 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d6wxt/crc-debug-m798g" podStartSLOduration=1.807917904 podStartE2EDuration="13.794798581s" podCreationTimestamp="2025-10-01 13:22:41 +0000 UTC" firstStartedPulling="2025-10-01 13:22:42.204064959 +0000 UTC m=+2740.525419796" lastFinishedPulling="2025-10-01 13:22:54.190945636 +0000 UTC m=+2752.512300473" observedRunningTime="2025-10-01 13:22:54.791369324 +0000 UTC m=+2753.112724161" watchObservedRunningTime="2025-10-01 13:22:54.794798581 +0000 UTC m=+2753.116153418" Oct 01 13:23:03 crc kubenswrapper[4727]: I1001 13:23:03.291936 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:23:03 crc kubenswrapper[4727]: I1001 13:23:03.292644 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:23:33 crc kubenswrapper[4727]: I1001 13:23:33.292462 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:23:33 crc kubenswrapper[4727]: I1001 13:23:33.292983 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:23:33 crc kubenswrapper[4727]: I1001 13:23:33.293048 4727 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 13:23:33 crc kubenswrapper[4727]: I1001 13:23:33.293847 4727 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b20f9df355cf8a1786e71a3d4bf9a8db762df0e7ec9fae3b46c91317a229a05"} pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:23:33 crc kubenswrapper[4727]: I1001 13:23:33.293968 4727 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" containerID="cri-o://0b20f9df355cf8a1786e71a3d4bf9a8db762df0e7ec9fae3b46c91317a229a05" gracePeriod=600 Oct 01 13:23:34 crc kubenswrapper[4727]: I1001 13:23:34.151105 4727 generic.go:334] "Generic (PLEG): container finished" podID="d18290ae-64a5-44a5-a704-90977d85852b" containerID="0b20f9df355cf8a1786e71a3d4bf9a8db762df0e7ec9fae3b46c91317a229a05" exitCode=0 Oct 01 13:23:34 crc kubenswrapper[4727]: I1001 13:23:34.151227 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerDied","Data":"0b20f9df355cf8a1786e71a3d4bf9a8db762df0e7ec9fae3b46c91317a229a05"} Oct 01 13:23:34 crc kubenswrapper[4727]: I1001 13:23:34.151893 4727 scope.go:117] "RemoveContainer" containerID="fa339b971ff7a7a28f68635f2f92cccac939d454156b4e9ce60dba49f570fde2" Oct 01 13:23:34 crc kubenswrapper[4727]: I1001 13:23:34.153633 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerStarted","Data":"21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128"} Oct 01 13:23:37 crc kubenswrapper[4727]: I1001 13:23:37.832716 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wr9rj"] Oct 01 13:23:37 crc kubenswrapper[4727]: I1001 13:23:37.835898 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:37 crc kubenswrapper[4727]: I1001 13:23:37.860154 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wr9rj"] Oct 01 13:23:37 crc kubenswrapper[4727]: I1001 13:23:37.929388 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/becbb755-aeb5-408a-935e-b86cb927f62d-catalog-content\") pod \"community-operators-wr9rj\" (UID: \"becbb755-aeb5-408a-935e-b86cb927f62d\") " pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:37 crc kubenswrapper[4727]: I1001 13:23:37.929597 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/becbb755-aeb5-408a-935e-b86cb927f62d-utilities\") pod \"community-operators-wr9rj\" (UID: \"becbb755-aeb5-408a-935e-b86cb927f62d\") " pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:37 crc kubenswrapper[4727]: I1001 13:23:37.929827 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlp69\" (UniqueName: \"kubernetes.io/projected/becbb755-aeb5-408a-935e-b86cb927f62d-kube-api-access-dlp69\") pod \"community-operators-wr9rj\" (UID: \"becbb755-aeb5-408a-935e-b86cb927f62d\") " pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:38 crc kubenswrapper[4727]: I1001 13:23:38.030629 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/becbb755-aeb5-408a-935e-b86cb927f62d-catalog-content\") pod \"community-operators-wr9rj\" (UID: 
\"becbb755-aeb5-408a-935e-b86cb927f62d\") " pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:38 crc kubenswrapper[4727]: I1001 13:23:38.030698 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/becbb755-aeb5-408a-935e-b86cb927f62d-utilities\") pod \"community-operators-wr9rj\" (UID: \"becbb755-aeb5-408a-935e-b86cb927f62d\") " pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:38 crc kubenswrapper[4727]: I1001 13:23:38.030772 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlp69\" (UniqueName: \"kubernetes.io/projected/becbb755-aeb5-408a-935e-b86cb927f62d-kube-api-access-dlp69\") pod \"community-operators-wr9rj\" (UID: \"becbb755-aeb5-408a-935e-b86cb927f62d\") " pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:38 crc kubenswrapper[4727]: I1001 13:23:38.031619 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/becbb755-aeb5-408a-935e-b86cb927f62d-catalog-content\") pod \"community-operators-wr9rj\" (UID: \"becbb755-aeb5-408a-935e-b86cb927f62d\") " pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:38 crc kubenswrapper[4727]: I1001 13:23:38.031847 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/becbb755-aeb5-408a-935e-b86cb927f62d-utilities\") pod \"community-operators-wr9rj\" (UID: \"becbb755-aeb5-408a-935e-b86cb927f62d\") " pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:38 crc kubenswrapper[4727]: I1001 13:23:38.066637 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlp69\" (UniqueName: \"kubernetes.io/projected/becbb755-aeb5-408a-935e-b86cb927f62d-kube-api-access-dlp69\") pod \"community-operators-wr9rj\" (UID: \"becbb755-aeb5-408a-935e-b86cb927f62d\") " pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:38 crc kubenswrapper[4727]: I1001 13:23:38.170437 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:38 crc kubenswrapper[4727]: I1001 13:23:38.826298 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wr9rj"] Oct 01 13:23:39 crc kubenswrapper[4727]: I1001 13:23:39.213526 4727 generic.go:334] "Generic (PLEG): container finished" podID="becbb755-aeb5-408a-935e-b86cb927f62d" containerID="7d00f3337fadd5833aa4dfcc332f1cde25400a949a94f4deca7550565f4839e9" exitCode=0 Oct 01 13:23:39 crc kubenswrapper[4727]: I1001 13:23:39.213889 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr9rj" event={"ID":"becbb755-aeb5-408a-935e-b86cb927f62d","Type":"ContainerDied","Data":"7d00f3337fadd5833aa4dfcc332f1cde25400a949a94f4deca7550565f4839e9"} Oct 01 13:23:39 crc kubenswrapper[4727]: I1001 13:23:39.214010 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr9rj" event={"ID":"becbb755-aeb5-408a-935e-b86cb927f62d","Type":"ContainerStarted","Data":"5edc4eb5bb15d2436f3ae2a952f288c54b9c67d9f8f56550a023704218bb47fd"} Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.226428 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-868ff"] Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.229701 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr9rj" event={"ID":"becbb755-aeb5-408a-935e-b86cb927f62d","Type":"ContainerStarted","Data":"46c5abae5068f0f39413d9cd6d010e1bd6099ae9f2fef6780d63ccb236ea4e17"} Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.229800 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.262727 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-868ff"] Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.377515 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faedd224-4efd-43f6-b18d-4c126f7a5353-utilities\") pod \"redhat-marketplace-868ff\" (UID: \"faedd224-4efd-43f6-b18d-4c126f7a5353\") " pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.377862 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faedd224-4efd-43f6-b18d-4c126f7a5353-catalog-content\") pod \"redhat-marketplace-868ff\" (UID: \"faedd224-4efd-43f6-b18d-4c126f7a5353\") " pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.378036 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8l4z\" (UniqueName: \"kubernetes.io/projected/faedd224-4efd-43f6-b18d-4c126f7a5353-kube-api-access-v8l4z\") pod \"redhat-marketplace-868ff\" (UID: \"faedd224-4efd-43f6-b18d-4c126f7a5353\") " pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.479864 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faedd224-4efd-43f6-b18d-4c126f7a5353-utilities\") pod \"redhat-marketplace-868ff\" (UID: 
\"faedd224-4efd-43f6-b18d-4c126f7a5353\") " pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.480331 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faedd224-4efd-43f6-b18d-4c126f7a5353-catalog-content\") pod \"redhat-marketplace-868ff\" (UID: \"faedd224-4efd-43f6-b18d-4c126f7a5353\") " pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.480474 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8l4z\" (UniqueName: \"kubernetes.io/projected/faedd224-4efd-43f6-b18d-4c126f7a5353-kube-api-access-v8l4z\") pod \"redhat-marketplace-868ff\" (UID: \"faedd224-4efd-43f6-b18d-4c126f7a5353\") " pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.480711 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faedd224-4efd-43f6-b18d-4c126f7a5353-utilities\") pod \"redhat-marketplace-868ff\" (UID: \"faedd224-4efd-43f6-b18d-4c126f7a5353\") " pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.481024 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faedd224-4efd-43f6-b18d-4c126f7a5353-catalog-content\") pod \"redhat-marketplace-868ff\" (UID: \"faedd224-4efd-43f6-b18d-4c126f7a5353\") " pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.524486 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8l4z\" (UniqueName: \"kubernetes.io/projected/faedd224-4efd-43f6-b18d-4c126f7a5353-kube-api-access-v8l4z\") pod \"redhat-marketplace-868ff\" (UID: \"faedd224-4efd-43f6-b18d-4c126f7a5353\") " pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:40 crc kubenswrapper[4727]: I1001 13:23:40.560799 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:41 crc kubenswrapper[4727]: I1001 13:23:41.133038 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-868ff"] Oct 01 13:23:41 crc kubenswrapper[4727]: I1001 13:23:41.245297 4727 generic.go:334] "Generic (PLEG): container finished" podID="becbb755-aeb5-408a-935e-b86cb927f62d" containerID="46c5abae5068f0f39413d9cd6d010e1bd6099ae9f2fef6780d63ccb236ea4e17" exitCode=0 Oct 01 13:23:41 crc kubenswrapper[4727]: I1001 13:23:41.245400 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr9rj" event={"ID":"becbb755-aeb5-408a-935e-b86cb927f62d","Type":"ContainerDied","Data":"46c5abae5068f0f39413d9cd6d010e1bd6099ae9f2fef6780d63ccb236ea4e17"} Oct 01 13:23:41 crc kubenswrapper[4727]: I1001 13:23:41.248206 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868ff" event={"ID":"faedd224-4efd-43f6-b18d-4c126f7a5353","Type":"ContainerStarted","Data":"ed86f15be5191973f79fb798436929a25020b9a3031956597c34627bee95d9ff"} Oct 01 13:23:42 crc kubenswrapper[4727]: I1001 13:23:42.260079 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr9rj" event={"ID":"becbb755-aeb5-408a-935e-b86cb927f62d","Type":"ContainerStarted","Data":"13fb2a4f7adde7d12cdee703a2178de8699f9db16e7d00cb0c0397aff98903d7"} Oct 01 13:23:42 crc kubenswrapper[4727]: I1001 13:23:42.262435 4727 generic.go:334] "Generic (PLEG): container finished" podID="faedd224-4efd-43f6-b18d-4c126f7a5353" containerID="ad5db14fdb8c4af88916fe3918e0b4af5046dbb534acc7c847bdebac98e08fc5" exitCode=0 Oct 01 13:23:42 crc kubenswrapper[4727]: I1001 13:23:42.262488 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868ff" event={"ID":"faedd224-4efd-43f6-b18d-4c126f7a5353","Type":"ContainerDied","Data":"ad5db14fdb8c4af88916fe3918e0b4af5046dbb534acc7c847bdebac98e08fc5"} Oct 01 13:23:42 crc kubenswrapper[4727]: I1001 13:23:42.284826 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wr9rj" podStartSLOduration=2.605731905 podStartE2EDuration="5.284797161s" podCreationTimestamp="2025-10-01 13:23:37 +0000 UTC" firstStartedPulling="2025-10-01 13:23:39.216694473 +0000 UTC m=+2797.538049310" lastFinishedPulling="2025-10-01 13:23:41.895759729 +0000 UTC m=+2800.217114566" observedRunningTime="2025-10-01 13:23:42.277704338 +0000 UTC m=+2800.599059185" watchObservedRunningTime="2025-10-01 13:23:42.284797161 +0000 UTC m=+2800.606151998" Oct 01 13:23:43 crc kubenswrapper[4727]: I1001 13:23:43.275466 4727 generic.go:334] "Generic (PLEG): container finished" podID="faedd224-4efd-43f6-b18d-4c126f7a5353" containerID="5a1c2098597e2c5a828b4d45cc98894c6010b89876c6de74a39810e2a1f54f84" exitCode=0 Oct 01 13:23:43 crc kubenswrapper[4727]: I1001 13:23:43.277197 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868ff" event={"ID":"faedd224-4efd-43f6-b18d-4c126f7a5353","Type":"ContainerDied","Data":"5a1c2098597e2c5a828b4d45cc98894c6010b89876c6de74a39810e2a1f54f84"} Oct 01 13:23:44 crc kubenswrapper[4727]: I1001 13:23:44.288159 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868ff" 
event={"ID":"faedd224-4efd-43f6-b18d-4c126f7a5353","Type":"ContainerStarted","Data":"5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9"} Oct 01 13:23:44 crc kubenswrapper[4727]: I1001 13:23:44.318512 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-868ff" podStartSLOduration=2.837086768 podStartE2EDuration="4.318488249s" podCreationTimestamp="2025-10-01 13:23:40 +0000 UTC" firstStartedPulling="2025-10-01 13:23:42.264880896 +0000 UTC m=+2800.586235733" lastFinishedPulling="2025-10-01 13:23:43.746282377 +0000 UTC m=+2802.067637214" observedRunningTime="2025-10-01 13:23:44.306108 +0000 UTC m=+2802.627462847" watchObservedRunningTime="2025-10-01 13:23:44.318488249 +0000 UTC m=+2802.639843086" Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.424085 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lmtnt"] Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.426752 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.443129 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmtnt"] Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.529960 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3713fdfb-7dbf-4f1c-bc32-542c815532a7-catalog-content\") pod \"redhat-operators-lmtnt\" (UID: \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\") " pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.530017 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3713fdfb-7dbf-4f1c-bc32-542c815532a7-utilities\") pod \"redhat-operators-lmtnt\" (UID: \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\") " pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.530066 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmg2z\" (UniqueName: \"kubernetes.io/projected/3713fdfb-7dbf-4f1c-bc32-542c815532a7-kube-api-access-pmg2z\") pod \"redhat-operators-lmtnt\" (UID: \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\") " pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.631617 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3713fdfb-7dbf-4f1c-bc32-542c815532a7-catalog-content\") pod \"redhat-operators-lmtnt\" (UID: \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\") " pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.631668 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3713fdfb-7dbf-4f1c-bc32-542c815532a7-utilities\") pod \"redhat-operators-lmtnt\" (UID: \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\") " pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.631718 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmg2z\" (UniqueName: 
\"kubernetes.io/projected/3713fdfb-7dbf-4f1c-bc32-542c815532a7-kube-api-access-pmg2z\") pod \"redhat-operators-lmtnt\" (UID: \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\") " pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.632159 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3713fdfb-7dbf-4f1c-bc32-542c815532a7-utilities\") pod \"redhat-operators-lmtnt\" (UID: \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\") " pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.632417 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3713fdfb-7dbf-4f1c-bc32-542c815532a7-catalog-content\") pod \"redhat-operators-lmtnt\" (UID: \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\") " pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.657655 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmg2z\" (UniqueName: \"kubernetes.io/projected/3713fdfb-7dbf-4f1c-bc32-542c815532a7-kube-api-access-pmg2z\") pod \"redhat-operators-lmtnt\" (UID: \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\") " pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.759757 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:47 crc kubenswrapper[4727]: I1001 13:23:47.970617 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6977f5dffd-4k4rv_81bb6eec-986d-4589-b007-dc88fcf1832b/barbican-api/0.log" Oct 01 13:23:48 crc kubenswrapper[4727]: I1001 13:23:48.036942 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6977f5dffd-4k4rv_81bb6eec-986d-4589-b007-dc88fcf1832b/barbican-api-log/0.log" Oct 01 13:23:48 crc kubenswrapper[4727]: I1001 13:23:48.143403 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lmtnt"] Oct 01 13:23:48 crc kubenswrapper[4727]: I1001 13:23:48.170811 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:48 crc kubenswrapper[4727]: I1001 13:23:48.171146 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:48 crc kubenswrapper[4727]: I1001 13:23:48.352279 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:48 crc kubenswrapper[4727]: I1001 13:23:48.365769 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmtnt" event={"ID":"3713fdfb-7dbf-4f1c-bc32-542c815532a7","Type":"ContainerStarted","Data":"eea7cf0b389c1182d0dc213091261869c445c97680bfd00ca807215bfae8cc5e"} Oct 01 13:23:48 crc kubenswrapper[4727]: I1001 13:23:48.477751 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:48 crc kubenswrapper[4727]: I1001 13:23:48.645154 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7485b6955d-qw7r5_14e20fde-b925-4ef7-b5f8-4b6a50544990/barbican-keystone-listener/0.log" Oct 01 13:23:48 crc 
kubenswrapper[4727]: I1001 13:23:48.830258 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7485b6955d-qw7r5_14e20fde-b925-4ef7-b5f8-4b6a50544990/barbican-keystone-listener-log/0.log" Oct 01 13:23:48 crc kubenswrapper[4727]: I1001 13:23:48.941630 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d4477b597-fvt5b_0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9/barbican-worker/0.log" Oct 01 13:23:49 crc kubenswrapper[4727]: I1001 13:23:49.111552 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d4477b597-fvt5b_0e691195-3fb7-4bfc-8a9f-09a35b6c9eb9/barbican-worker-log/0.log" Oct 01 13:23:49 crc kubenswrapper[4727]: I1001 13:23:49.379203 4727 generic.go:334] "Generic (PLEG): container finished" podID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" containerID="a30bd551ea9a4d7c812fabaa28232e3edb8b6921fbfb8dedb82bfa5ba7bbea87" exitCode=0 Oct 01 13:23:49 crc kubenswrapper[4727]: I1001 13:23:49.379312 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmtnt" event={"ID":"3713fdfb-7dbf-4f1c-bc32-542c815532a7","Type":"ContainerDied","Data":"a30bd551ea9a4d7c812fabaa28232e3edb8b6921fbfb8dedb82bfa5ba7bbea87"} Oct 01 13:23:49 crc kubenswrapper[4727]: I1001 13:23:49.505391 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-c4hbx_2f2522c5-4bf3-4d82-af9a-546abdb6c4be/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:23:49 crc kubenswrapper[4727]: I1001 13:23:49.595584 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_73a88e6f-bc10-4121-a28a-e9f1ae533e6a/cinder-api/0.log" Oct 01 13:23:49 crc kubenswrapper[4727]: I1001 13:23:49.744349 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_73a88e6f-bc10-4121-a28a-e9f1ae533e6a/cinder-api-log/0.log" Oct 01 13:23:49 crc kubenswrapper[4727]: I1001 13:23:49.842721 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b26dc55a-af6d-4797-a736-e1ad576aef99/cinder-scheduler/0.log" Oct 01 13:23:50 crc kubenswrapper[4727]: I1001 13:23:50.028938 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b26dc55a-af6d-4797-a736-e1ad576aef99/probe/0.log" Oct 01 13:23:50 crc kubenswrapper[4727]: I1001 13:23:50.124668 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kwnkt_2d0a1f80-62ee-4cc4-9a3b-48e7289508c4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:23:50 crc kubenswrapper[4727]: I1001 13:23:50.350630 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-gbd4j_0e5dff42-2d16-4c83-adc0-bdad8d122cc3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:23:50 crc kubenswrapper[4727]: I1001 13:23:50.561022 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:50 crc kubenswrapper[4727]: I1001 13:23:50.561077 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:50 crc kubenswrapper[4727]: I1001 13:23:50.577927 4727 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-lnfhz_5a70d26d-18ff-4550-b1eb-a720a38162ee/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:23:50 crc kubenswrapper[4727]: I1001 13:23:50.621690 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wr9rj"] Oct 01 13:23:50 crc kubenswrapper[4727]: I1001 13:23:50.632718 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:50 crc kubenswrapper[4727]: I1001 13:23:50.787557 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-65bcw_aa6f6783-3b1f-4c21-aee4-6f35cf66d17f/init/0.log" Oct 01 13:23:51 crc kubenswrapper[4727]: I1001 13:23:51.028044 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-65bcw_aa6f6783-3b1f-4c21-aee4-6f35cf66d17f/init/0.log" Oct 01 13:23:51 crc kubenswrapper[4727]: I1001 13:23:51.053371 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-65bcw_aa6f6783-3b1f-4c21-aee4-6f35cf66d17f/dnsmasq-dns/0.log" Oct 01 13:23:51 crc kubenswrapper[4727]: I1001 13:23:51.410349 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmtnt" event={"ID":"3713fdfb-7dbf-4f1c-bc32-542c815532a7","Type":"ContainerStarted","Data":"54864066eb34ca18bad6144b8b46cd65f1efedc96491abad6f590eeec8f85aa8"} Oct 01 13:23:51 crc kubenswrapper[4727]: I1001 13:23:51.410719 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wr9rj" podUID="becbb755-aeb5-408a-935e-b86cb927f62d" containerName="registry-server" containerID="cri-o://13fb2a4f7adde7d12cdee703a2178de8699f9db16e7d00cb0c0397aff98903d7" gracePeriod=2 Oct 01 13:23:51 crc kubenswrapper[4727]: I1001 13:23:51.415763 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-vkhcj_0f819364-69c9-47d2-9876-82a3081ab579/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:23:51 crc kubenswrapper[4727]: I1001 13:23:51.422494 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3a3b47a8-5894-49fc-a9a6-9a5f9062b439/glance-httpd/0.log" Oct 01 13:23:51 crc kubenswrapper[4727]: I1001 13:23:51.477688 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:51 crc kubenswrapper[4727]: I1001 13:23:51.658515 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3a3b47a8-5894-49fc-a9a6-9a5f9062b439/glance-log/0.log" Oct 01 13:23:51 crc kubenswrapper[4727]: I1001 13:23:51.723700 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_325d7807-2792-4b29-bbfe-154c6af17f6d/glance-httpd/0.log" Oct 01 13:23:51 crc kubenswrapper[4727]: I1001 13:23:51.751629 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_325d7807-2792-4b29-bbfe-154c6af17f6d/glance-log/0.log" Oct 01 13:23:51 crc kubenswrapper[4727]: I1001 13:23:51.976421 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-b5wgr_44e3fbd7-8cb7-462a-9990-f3e82b978c55/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 
13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.017097 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-wm46g_388b066d-a9db-4f3c-a0e1-c03c12aac2df/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.285142 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7f4d7ff84c-l27rl_80e31dea-5550-409a-8f5e-5eec07106dcd/keystone-api/0.log" Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.299687 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29322061-hhrsh_afedada7-a84e-4fdc-94f1-feb3b93398d1/keystone-cron/0.log" Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.467818 4727 generic.go:334] "Generic (PLEG): container finished" podID="becbb755-aeb5-408a-935e-b86cb927f62d" containerID="13fb2a4f7adde7d12cdee703a2178de8699f9db16e7d00cb0c0397aff98903d7" exitCode=0 Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.467905 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr9rj" event={"ID":"becbb755-aeb5-408a-935e-b86cb927f62d","Type":"ContainerDied","Data":"13fb2a4f7adde7d12cdee703a2178de8699f9db16e7d00cb0c0397aff98903d7"} Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.528697 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.615281 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-8mvfm_a1a0a9d4-1dbb-43f2-ac79-07ac0205e34d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.650273 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/becbb755-aeb5-408a-935e-b86cb927f62d-catalog-content\") pod \"becbb755-aeb5-408a-935e-b86cb927f62d\" (UID: \"becbb755-aeb5-408a-935e-b86cb927f62d\") " Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.650665 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/becbb755-aeb5-408a-935e-b86cb927f62d-utilities\") pod \"becbb755-aeb5-408a-935e-b86cb927f62d\" (UID: \"becbb755-aeb5-408a-935e-b86cb927f62d\") " Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.650798 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlp69\" (UniqueName: \"kubernetes.io/projected/becbb755-aeb5-408a-935e-b86cb927f62d-kube-api-access-dlp69\") pod \"becbb755-aeb5-408a-935e-b86cb927f62d\" (UID: \"becbb755-aeb5-408a-935e-b86cb927f62d\") " Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.651285 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/becbb755-aeb5-408a-935e-b86cb927f62d-utilities" (OuterVolumeSpecName: "utilities") pod "becbb755-aeb5-408a-935e-b86cb927f62d" (UID: "becbb755-aeb5-408a-935e-b86cb927f62d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.669564 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/becbb755-aeb5-408a-935e-b86cb927f62d-kube-api-access-dlp69" (OuterVolumeSpecName: "kube-api-access-dlp69") pod "becbb755-aeb5-408a-935e-b86cb927f62d" (UID: "becbb755-aeb5-408a-935e-b86cb927f62d"). InnerVolumeSpecName "kube-api-access-dlp69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.740282 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/becbb755-aeb5-408a-935e-b86cb927f62d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "becbb755-aeb5-408a-935e-b86cb927f62d" (UID: "becbb755-aeb5-408a-935e-b86cb927f62d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.753234 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/becbb755-aeb5-408a-935e-b86cb927f62d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.753293 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/becbb755-aeb5-408a-935e-b86cb927f62d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.753311 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlp69\" (UniqueName: \"kubernetes.io/projected/becbb755-aeb5-408a-935e-b86cb927f62d-kube-api-access-dlp69\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:52 crc kubenswrapper[4727]: I1001 13:23:52.961890 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-75b5d456dc-grn5w_e1c67043-5e23-4c7a-92b8-b7d1513f1392/neutron-httpd/0.log" Oct 01 13:23:53 crc kubenswrapper[4727]: I1001 13:23:53.020661 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-75b5d456dc-grn5w_e1c67043-5e23-4c7a-92b8-b7d1513f1392/neutron-api/0.log" Oct 01 13:23:53 crc kubenswrapper[4727]: I1001 13:23:53.336445 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-lv27x_55cb8c1d-db01-4bc4-9c27-3a1fed55d823/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:23:53 crc kubenswrapper[4727]: I1001 13:23:53.484298 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wr9rj" event={"ID":"becbb755-aeb5-408a-935e-b86cb927f62d","Type":"ContainerDied","Data":"5edc4eb5bb15d2436f3ae2a952f288c54b9c67d9f8f56550a023704218bb47fd"} Oct 01 13:23:53 crc kubenswrapper[4727]: I1001 13:23:53.484392 4727 scope.go:117] "RemoveContainer" containerID="13fb2a4f7adde7d12cdee703a2178de8699f9db16e7d00cb0c0397aff98903d7" Oct 01 13:23:53 crc kubenswrapper[4727]: I1001 13:23:53.484411 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wr9rj" Oct 01 13:23:53 crc kubenswrapper[4727]: I1001 13:23:53.528764 4727 scope.go:117] "RemoveContainer" containerID="46c5abae5068f0f39413d9cd6d010e1bd6099ae9f2fef6780d63ccb236ea4e17" Oct 01 13:23:53 crc kubenswrapper[4727]: I1001 13:23:53.542151 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wr9rj"] Oct 01 13:23:53 crc kubenswrapper[4727]: I1001 13:23:53.550295 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wr9rj"] Oct 01 13:23:53 crc kubenswrapper[4727]: I1001 13:23:53.562378 4727 scope.go:117] "RemoveContainer" containerID="7d00f3337fadd5833aa4dfcc332f1cde25400a949a94f4deca7550565f4839e9" Oct 01 13:23:53 crc kubenswrapper[4727]: I1001 13:23:53.884260 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d1cdebdb-ee16-4183-a5ca-c80527ec9d5e/nova-api-log/0.log" Oct 01 13:23:53 crc kubenswrapper[4727]: I1001 13:23:53.998167 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d1cdebdb-ee16-4183-a5ca-c80527ec9d5e/nova-api-api/0.log" Oct 01 13:23:54 crc kubenswrapper[4727]: I1001 13:23:54.138222 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_72df30c3-21c3-4ff3-b799-0833159289b0/nova-cell0-conductor-conductor/0.log" Oct 01 13:23:54 crc kubenswrapper[4727]: I1001 13:23:54.378361 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_913743df-f049-4011-bbff-2d7abf043bf3/nova-cell1-conductor-conductor/0.log" Oct 01 13:23:54 crc kubenswrapper[4727]: I1001 13:23:54.399282 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="becbb755-aeb5-408a-935e-b86cb927f62d" path="/var/lib/kubelet/pods/becbb755-aeb5-408a-935e-b86cb927f62d/volumes" Oct 01 13:23:54 crc kubenswrapper[4727]: I1001 13:23:54.514291 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0de3207c-19cd-4cb7-a637-642aa2127265/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 13:23:54 crc kubenswrapper[4727]: I1001 13:23:54.826233 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-868ff"] Oct 01 13:23:54 crc kubenswrapper[4727]: I1001 13:23:54.830723 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-868ff" podUID="faedd224-4efd-43f6-b18d-4c126f7a5353" containerName="registry-server" containerID="cri-o://5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9" gracePeriod=2 Oct 01 13:23:54 crc kubenswrapper[4727]: I1001 13:23:54.905818 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-9b767_9622021e-ef0b-4274-a356-f61405a2dd9b/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.117202 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca/nova-metadata-log/0.log" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.382637 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.522156 4727 generic.go:334] "Generic (PLEG): container finished" podID="faedd224-4efd-43f6-b18d-4c126f7a5353" containerID="5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9" exitCode=0 Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.522297 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-868ff" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.522307 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868ff" event={"ID":"faedd224-4efd-43f6-b18d-4c126f7a5353","Type":"ContainerDied","Data":"5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9"} Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.522365 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868ff" event={"ID":"faedd224-4efd-43f6-b18d-4c126f7a5353","Type":"ContainerDied","Data":"ed86f15be5191973f79fb798436929a25020b9a3031956597c34627bee95d9ff"} Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.522387 4727 scope.go:117] "RemoveContainer" containerID="5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.526508 4727 generic.go:334] "Generic (PLEG): container finished" podID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" containerID="54864066eb34ca18bad6144b8b46cd65f1efedc96491abad6f590eeec8f85aa8" exitCode=0 Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.526548 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmtnt" event={"ID":"3713fdfb-7dbf-4f1c-bc32-542c815532a7","Type":"ContainerDied","Data":"54864066eb34ca18bad6144b8b46cd65f1efedc96491abad6f590eeec8f85aa8"} Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.538712 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faedd224-4efd-43f6-b18d-4c126f7a5353-utilities\") pod \"faedd224-4efd-43f6-b18d-4c126f7a5353\" (UID: \"faedd224-4efd-43f6-b18d-4c126f7a5353\") " Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.538853 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8l4z\" (UniqueName: \"kubernetes.io/projected/faedd224-4efd-43f6-b18d-4c126f7a5353-kube-api-access-v8l4z\") pod \"faedd224-4efd-43f6-b18d-4c126f7a5353\" (UID: \"faedd224-4efd-43f6-b18d-4c126f7a5353\") " Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.538930 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faedd224-4efd-43f6-b18d-4c126f7a5353-catalog-content\") pod \"faedd224-4efd-43f6-b18d-4c126f7a5353\" (UID: \"faedd224-4efd-43f6-b18d-4c126f7a5353\") " Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.546489 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faedd224-4efd-43f6-b18d-4c126f7a5353-utilities" (OuterVolumeSpecName: "utilities") pod "faedd224-4efd-43f6-b18d-4c126f7a5353" (UID: "faedd224-4efd-43f6-b18d-4c126f7a5353"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.565564 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faedd224-4efd-43f6-b18d-4c126f7a5353-kube-api-access-v8l4z" (OuterVolumeSpecName: "kube-api-access-v8l4z") pod "faedd224-4efd-43f6-b18d-4c126f7a5353" (UID: "faedd224-4efd-43f6-b18d-4c126f7a5353"). InnerVolumeSpecName "kube-api-access-v8l4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.567880 4727 scope.go:117] "RemoveContainer" containerID="5a1c2098597e2c5a828b4d45cc98894c6010b89876c6de74a39810e2a1f54f84" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.570773 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faedd224-4efd-43f6-b18d-4c126f7a5353-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "faedd224-4efd-43f6-b18d-4c126f7a5353" (UID: "faedd224-4efd-43f6-b18d-4c126f7a5353"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.644264 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faedd224-4efd-43f6-b18d-4c126f7a5353-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.644306 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8l4z\" (UniqueName: \"kubernetes.io/projected/faedd224-4efd-43f6-b18d-4c126f7a5353-kube-api-access-v8l4z\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.644325 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faedd224-4efd-43f6-b18d-4c126f7a5353-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.691523 4727 scope.go:117] "RemoveContainer" containerID="ad5db14fdb8c4af88916fe3918e0b4af5046dbb534acc7c847bdebac98e08fc5" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.728016 4727 scope.go:117] "RemoveContainer" containerID="5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9" Oct 01 13:23:55 crc kubenswrapper[4727]: E1001 13:23:55.728825 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9\": container with ID starting with 5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9 not found: ID does not exist" containerID="5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.728909 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9"} err="failed to get container status \"5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9\": rpc error: code = NotFound desc = could not find container \"5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9\": container with ID starting with 5cf7857b1bea70f5b2c5e5b1a077b416c3e6d5223a39c38bc7737cec8133aae9 not found: ID does not exist" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.728941 4727 scope.go:117] "RemoveContainer" containerID="5a1c2098597e2c5a828b4d45cc98894c6010b89876c6de74a39810e2a1f54f84" Oct 01 
13:23:55 crc kubenswrapper[4727]: E1001 13:23:55.729695 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a1c2098597e2c5a828b4d45cc98894c6010b89876c6de74a39810e2a1f54f84\": container with ID starting with 5a1c2098597e2c5a828b4d45cc98894c6010b89876c6de74a39810e2a1f54f84 not found: ID does not exist" containerID="5a1c2098597e2c5a828b4d45cc98894c6010b89876c6de74a39810e2a1f54f84" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.729771 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1c2098597e2c5a828b4d45cc98894c6010b89876c6de74a39810e2a1f54f84"} err="failed to get container status \"5a1c2098597e2c5a828b4d45cc98894c6010b89876c6de74a39810e2a1f54f84\": rpc error: code = NotFound desc = could not find container \"5a1c2098597e2c5a828b4d45cc98894c6010b89876c6de74a39810e2a1f54f84\": container with ID starting with 5a1c2098597e2c5a828b4d45cc98894c6010b89876c6de74a39810e2a1f54f84 not found: ID does not exist" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.729818 4727 scope.go:117] "RemoveContainer" containerID="ad5db14fdb8c4af88916fe3918e0b4af5046dbb534acc7c847bdebac98e08fc5" Oct 01 13:23:55 crc kubenswrapper[4727]: E1001 13:23:55.730421 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad5db14fdb8c4af88916fe3918e0b4af5046dbb534acc7c847bdebac98e08fc5\": container with ID starting with ad5db14fdb8c4af88916fe3918e0b4af5046dbb534acc7c847bdebac98e08fc5 not found: ID does not exist" containerID="ad5db14fdb8c4af88916fe3918e0b4af5046dbb534acc7c847bdebac98e08fc5" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.730505 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5db14fdb8c4af88916fe3918e0b4af5046dbb534acc7c847bdebac98e08fc5"} err="failed to get container status \"ad5db14fdb8c4af88916fe3918e0b4af5046dbb534acc7c847bdebac98e08fc5\": rpc error: code = NotFound desc = could not find container \"ad5db14fdb8c4af88916fe3918e0b4af5046dbb534acc7c847bdebac98e08fc5\": container with ID starting with ad5db14fdb8c4af88916fe3918e0b4af5046dbb534acc7c847bdebac98e08fc5 not found: ID does not exist" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.740944 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_79a1e4c0-8104-4710-91ca-e9a32c934c9b/nova-scheduler-scheduler/0.log" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.776476 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_01e2d457-092b-4b9d-a5fc-375a59758259/mysql-bootstrap/0.log" Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.863356 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-868ff"] Oct 01 13:23:55 crc kubenswrapper[4727]: I1001 13:23:55.879912 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-868ff"] Oct 01 13:23:56 crc kubenswrapper[4727]: I1001 13:23:56.082494 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_01e2d457-092b-4b9d-a5fc-375a59758259/galera/0.log" Oct 01 13:23:56 crc kubenswrapper[4727]: I1001 13:23:56.090378 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_01e2d457-092b-4b9d-a5fc-375a59758259/mysql-bootstrap/0.log" Oct 01 13:23:56 crc kubenswrapper[4727]: 
I1001 13:23:56.388850 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faedd224-4efd-43f6-b18d-4c126f7a5353" path="/var/lib/kubelet/pods/faedd224-4efd-43f6-b18d-4c126f7a5353/volumes" Oct 01 13:23:56 crc kubenswrapper[4727]: I1001 13:23:56.432323 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5d8b1b0d-ac72-47a0-a5fb-01e6e2ff46ca/nova-metadata-metadata/0.log" Oct 01 13:23:56 crc kubenswrapper[4727]: I1001 13:23:56.467557 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d10e37bd-ab54-4798-bfa1-a94f2e13eba0/mysql-bootstrap/0.log" Oct 01 13:23:56 crc kubenswrapper[4727]: I1001 13:23:56.733119 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d10e37bd-ab54-4798-bfa1-a94f2e13eba0/mysql-bootstrap/0.log" Oct 01 13:23:56 crc kubenswrapper[4727]: I1001 13:23:56.775943 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d10e37bd-ab54-4798-bfa1-a94f2e13eba0/galera/0.log" Oct 01 13:23:57 crc kubenswrapper[4727]: I1001 13:23:57.093893 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fc493472-2f4d-4d92-9ba3-22850bd45ae6/openstackclient/0.log" Oct 01 13:23:57 crc kubenswrapper[4727]: I1001 13:23:57.206888 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fp98w_3d352723-6895-41f9-9ed5-7a90cb94dad6/openstack-network-exporter/0.log" Oct 01 13:23:57 crc kubenswrapper[4727]: I1001 13:23:57.462792 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-f9xhb_85f4f20e-6398-4386-a1b1-d34d0a4159b3/ovsdb-server-init/0.log" Oct 01 13:23:57 crc kubenswrapper[4727]: I1001 13:23:57.633193 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmtnt" event={"ID":"3713fdfb-7dbf-4f1c-bc32-542c815532a7","Type":"ContainerStarted","Data":"52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455"} Oct 01 13:23:57 crc kubenswrapper[4727]: I1001 13:23:57.651652 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-f9xhb_85f4f20e-6398-4386-a1b1-d34d0a4159b3/ovsdb-server-init/0.log" Oct 01 13:23:57 crc kubenswrapper[4727]: I1001 13:23:57.676612 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lmtnt" podStartSLOduration=3.686484411 podStartE2EDuration="10.676590569s" podCreationTimestamp="2025-10-01 13:23:47 +0000 UTC" firstStartedPulling="2025-10-01 13:23:49.389068794 +0000 UTC m=+2807.710423631" lastFinishedPulling="2025-10-01 13:23:56.379174952 +0000 UTC m=+2814.700529789" observedRunningTime="2025-10-01 13:23:57.669365282 +0000 UTC m=+2815.990720119" watchObservedRunningTime="2025-10-01 13:23:57.676590569 +0000 UTC m=+2815.997945406" Oct 01 13:23:57 crc kubenswrapper[4727]: I1001 13:23:57.760163 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:57 crc kubenswrapper[4727]: I1001 13:23:57.760225 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:23:57 crc kubenswrapper[4727]: I1001 13:23:57.799581 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-f9xhb_85f4f20e-6398-4386-a1b1-d34d0a4159b3/ovs-vswitchd/0.log" Oct 01 13:23:57 crc kubenswrapper[4727]: 
I1001 13:23:57.871809 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-f9xhb_85f4f20e-6398-4386-a1b1-d34d0a4159b3/ovsdb-server/0.log" Oct 01 13:23:58 crc kubenswrapper[4727]: I1001 13:23:58.081827 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-v56sx_fb0c554e-ed3f-4476-9963-dabc0089698d/ovn-controller/0.log" Oct 01 13:23:58 crc kubenswrapper[4727]: I1001 13:23:58.322971 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-9cvcv_f6510eaa-3789-48b0-94cc-c300c64714a2/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:23:58 crc kubenswrapper[4727]: I1001 13:23:58.583396 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9dcad3e1-a101-49be-a117-fe48c45b2ab5/openstack-network-exporter/0.log" Oct 01 13:23:58 crc kubenswrapper[4727]: I1001 13:23:58.677495 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9dcad3e1-a101-49be-a117-fe48c45b2ab5/ovn-northd/0.log" Oct 01 13:23:58 crc kubenswrapper[4727]: I1001 13:23:58.821802 4727 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lmtnt" podUID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" containerName="registry-server" probeResult="failure" output=< Oct 01 13:23:58 crc kubenswrapper[4727]: timeout: failed to connect service ":50051" within 1s Oct 01 13:23:58 crc kubenswrapper[4727]: > Oct 01 13:23:58 crc kubenswrapper[4727]: I1001 13:23:58.941942 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0eab2ee0-0da5-4935-bb03-270c81efbe57/openstack-network-exporter/0.log" Oct 01 13:23:59 crc kubenswrapper[4727]: I1001 13:23:59.064071 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0eab2ee0-0da5-4935-bb03-270c81efbe57/ovsdbserver-nb/0.log" Oct 01 13:23:59 crc kubenswrapper[4727]: I1001 13:23:59.265500 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_02a48236-0ee4-40a0-b3eb-0f8f8de19b65/openstack-network-exporter/0.log" Oct 01 13:23:59 crc kubenswrapper[4727]: I1001 13:23:59.397574 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_02a48236-0ee4-40a0-b3eb-0f8f8de19b65/ovsdbserver-sb/0.log" Oct 01 13:23:59 crc kubenswrapper[4727]: I1001 13:23:59.526497 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-648455799b-c8jzs_87998a25-3079-49a0-93da-d4326ed0ccc3/placement-api/0.log" Oct 01 13:23:59 crc kubenswrapper[4727]: I1001 13:23:59.767677 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-648455799b-c8jzs_87998a25-3079-49a0-93da-d4326ed0ccc3/placement-log/0.log" Oct 01 13:23:59 crc kubenswrapper[4727]: I1001 13:23:59.897258 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cd3bde15-3916-4632-97e6-50a7a6d2c60f/setup-container/0.log" Oct 01 13:24:00 crc kubenswrapper[4727]: I1001 13:24:00.161591 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cd3bde15-3916-4632-97e6-50a7a6d2c60f/setup-container/0.log" Oct 01 13:24:00 crc kubenswrapper[4727]: I1001 13:24:00.184490 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cd3bde15-3916-4632-97e6-50a7a6d2c60f/rabbitmq/0.log" Oct 01 13:24:00 crc kubenswrapper[4727]: I1001 13:24:00.480786 4727 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4ed96f0b-b8d7-47f1-aa9c-3af04e140681/setup-container/0.log" Oct 01 13:24:00 crc kubenswrapper[4727]: I1001 13:24:00.814972 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4ed96f0b-b8d7-47f1-aa9c-3af04e140681/setup-container/0.log" Oct 01 13:24:00 crc kubenswrapper[4727]: I1001 13:24:00.822688 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4ed96f0b-b8d7-47f1-aa9c-3af04e140681/rabbitmq/0.log" Oct 01 13:24:01 crc kubenswrapper[4727]: I1001 13:24:01.129591 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-n5dx4_0fb255fc-0c75-4a22-9f3c-2edb4f8fc01c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:24:01 crc kubenswrapper[4727]: I1001 13:24:01.217482 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-2pjnj_ef6454da-b104-45bf-870f-feecead2142f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:24:01 crc kubenswrapper[4727]: I1001 13:24:01.446663 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-l4stf_87b03652-a89c-43d2-9cef-c78c540c52a8/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:24:01 crc kubenswrapper[4727]: I1001 13:24:01.761352 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-xx7dq_0e21e9bf-b02d-4d3a-9f81-56d94cfaf32a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:24:01 crc kubenswrapper[4727]: I1001 13:24:01.798216 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-ksfcf_6ee1dacb-0b88-4de8-aa88-a24d1494ed94/ssh-known-hosts-edpm-deployment/0.log" Oct 01 13:24:01 crc kubenswrapper[4727]: I1001 13:24:01.988258 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7df68f6869-rwfcm_7c5e6c5d-4f10-437f-b20e-f3394093b3b9/proxy-server/0.log" Oct 01 13:24:02 crc kubenswrapper[4727]: I1001 13:24:02.106711 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7df68f6869-rwfcm_7c5e6c5d-4f10-437f-b20e-f3394093b3b9/proxy-httpd/0.log" Oct 01 13:24:02 crc kubenswrapper[4727]: I1001 13:24:02.281933 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-jwf2c_94f8ea93-8124-4473-9a83-c70e83c642f0/swift-ring-rebalance/0.log" Oct 01 13:24:02 crc kubenswrapper[4727]: I1001 13:24:02.463732 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/account-auditor/0.log" Oct 01 13:24:02 crc kubenswrapper[4727]: I1001 13:24:02.513944 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/account-reaper/0.log" Oct 01 13:24:02 crc kubenswrapper[4727]: I1001 13:24:02.668150 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/account-replicator/0.log" Oct 01 13:24:02 crc kubenswrapper[4727]: I1001 13:24:02.732272 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/account-server/0.log" Oct 01 13:24:02 crc kubenswrapper[4727]: I1001 13:24:02.768047 4727 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/container-auditor/0.log" Oct 01 13:24:02 crc kubenswrapper[4727]: I1001 13:24:02.961303 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/container-server/0.log" Oct 01 13:24:02 crc kubenswrapper[4727]: I1001 13:24:02.972573 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/container-replicator/0.log" Oct 01 13:24:03 crc kubenswrapper[4727]: I1001 13:24:03.069224 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/container-updater/0.log" Oct 01 13:24:03 crc kubenswrapper[4727]: I1001 13:24:03.227197 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/object-auditor/0.log" Oct 01 13:24:03 crc kubenswrapper[4727]: I1001 13:24:03.250748 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/object-expirer/0.log" Oct 01 13:24:03 crc kubenswrapper[4727]: I1001 13:24:03.386901 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/object-replicator/0.log" Oct 01 13:24:03 crc kubenswrapper[4727]: I1001 13:24:03.474101 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/object-server/0.log" Oct 01 13:24:03 crc kubenswrapper[4727]: I1001 13:24:03.538851 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/object-updater/0.log" Oct 01 13:24:03 crc kubenswrapper[4727]: I1001 13:24:03.682633 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/rsync/0.log" Oct 01 13:24:03 crc kubenswrapper[4727]: I1001 13:24:03.725263 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d4f71a40-0089-4219-9ff4-837dfaf28b74/swift-recon-cron/0.log" Oct 01 13:24:04 crc kubenswrapper[4727]: I1001 13:24:04.006075 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-mk2hs_9b8cc0bf-2075-47fa-a5d3-2f7b3cd4cfbf/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:24:04 crc kubenswrapper[4727]: I1001 13:24:04.130894 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-l4nwp_8b36bd6d-4297-4c41-a010-4d3b10e169b2/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 13:24:06 crc kubenswrapper[4727]: I1001 13:24:06.451944 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_fca4f477-e812-4926-9935-8bfc1e2ca89a/memcached/0.log" Oct 01 13:24:07 crc kubenswrapper[4727]: I1001 13:24:07.819751 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:24:07 crc kubenswrapper[4727]: I1001 13:24:07.873521 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:24:08 crc kubenswrapper[4727]: I1001 13:24:08.077697 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-lmtnt"] Oct 01 13:24:09 crc kubenswrapper[4727]: I1001 13:24:09.763166 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lmtnt" podUID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" containerName="registry-server" containerID="cri-o://52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455" gracePeriod=2 Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.218670 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.341524 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3713fdfb-7dbf-4f1c-bc32-542c815532a7-catalog-content\") pod \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\" (UID: \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\") " Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.341722 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmg2z\" (UniqueName: \"kubernetes.io/projected/3713fdfb-7dbf-4f1c-bc32-542c815532a7-kube-api-access-pmg2z\") pod \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\" (UID: \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\") " Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.341787 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3713fdfb-7dbf-4f1c-bc32-542c815532a7-utilities\") pod \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\" (UID: \"3713fdfb-7dbf-4f1c-bc32-542c815532a7\") " Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.343133 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3713fdfb-7dbf-4f1c-bc32-542c815532a7-utilities" (OuterVolumeSpecName: "utilities") pod "3713fdfb-7dbf-4f1c-bc32-542c815532a7" (UID: "3713fdfb-7dbf-4f1c-bc32-542c815532a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.356236 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3713fdfb-7dbf-4f1c-bc32-542c815532a7-kube-api-access-pmg2z" (OuterVolumeSpecName: "kube-api-access-pmg2z") pod "3713fdfb-7dbf-4f1c-bc32-542c815532a7" (UID: "3713fdfb-7dbf-4f1c-bc32-542c815532a7"). InnerVolumeSpecName "kube-api-access-pmg2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.439742 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3713fdfb-7dbf-4f1c-bc32-542c815532a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3713fdfb-7dbf-4f1c-bc32-542c815532a7" (UID: "3713fdfb-7dbf-4f1c-bc32-542c815532a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.444557 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmg2z\" (UniqueName: \"kubernetes.io/projected/3713fdfb-7dbf-4f1c-bc32-542c815532a7-kube-api-access-pmg2z\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.444598 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3713fdfb-7dbf-4f1c-bc32-542c815532a7-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.444608 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3713fdfb-7dbf-4f1c-bc32-542c815532a7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.773868 4727 generic.go:334] "Generic (PLEG): container finished" podID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" containerID="52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455" exitCode=0 Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.773908 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmtnt" event={"ID":"3713fdfb-7dbf-4f1c-bc32-542c815532a7","Type":"ContainerDied","Data":"52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455"} Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.773934 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lmtnt" event={"ID":"3713fdfb-7dbf-4f1c-bc32-542c815532a7","Type":"ContainerDied","Data":"eea7cf0b389c1182d0dc213091261869c445c97680bfd00ca807215bfae8cc5e"} Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.773954 4727 scope.go:117] "RemoveContainer" containerID="52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.773954 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lmtnt" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.798089 4727 scope.go:117] "RemoveContainer" containerID="54864066eb34ca18bad6144b8b46cd65f1efedc96491abad6f590eeec8f85aa8" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.818094 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lmtnt"] Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.826082 4727 scope.go:117] "RemoveContainer" containerID="a30bd551ea9a4d7c812fabaa28232e3edb8b6921fbfb8dedb82bfa5ba7bbea87" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.832902 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lmtnt"] Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.884539 4727 scope.go:117] "RemoveContainer" containerID="52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455" Oct 01 13:24:10 crc kubenswrapper[4727]: E1001 13:24:10.887320 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455\": container with ID starting with 52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455 not found: ID does not exist" containerID="52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.887383 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455"} err="failed to get container status \"52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455\": rpc error: code = NotFound desc = could not find container \"52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455\": container with ID starting with 52272128c15f27ce600590beb397aa03e9cc7bbfae19f0092d4422be12195455 not found: ID does not exist" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.887424 4727 scope.go:117] "RemoveContainer" containerID="54864066eb34ca18bad6144b8b46cd65f1efedc96491abad6f590eeec8f85aa8" Oct 01 13:24:10 crc kubenswrapper[4727]: E1001 13:24:10.887838 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54864066eb34ca18bad6144b8b46cd65f1efedc96491abad6f590eeec8f85aa8\": container with ID starting with 54864066eb34ca18bad6144b8b46cd65f1efedc96491abad6f590eeec8f85aa8 not found: ID does not exist" containerID="54864066eb34ca18bad6144b8b46cd65f1efedc96491abad6f590eeec8f85aa8" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.887874 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54864066eb34ca18bad6144b8b46cd65f1efedc96491abad6f590eeec8f85aa8"} err="failed to get container status \"54864066eb34ca18bad6144b8b46cd65f1efedc96491abad6f590eeec8f85aa8\": rpc error: code = NotFound desc = could not find container \"54864066eb34ca18bad6144b8b46cd65f1efedc96491abad6f590eeec8f85aa8\": container with ID starting with 54864066eb34ca18bad6144b8b46cd65f1efedc96491abad6f590eeec8f85aa8 not found: ID does not exist" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.887894 4727 scope.go:117] "RemoveContainer" containerID="a30bd551ea9a4d7c812fabaa28232e3edb8b6921fbfb8dedb82bfa5ba7bbea87" Oct 01 13:24:10 crc kubenswrapper[4727]: E1001 13:24:10.888235 4727 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a30bd551ea9a4d7c812fabaa28232e3edb8b6921fbfb8dedb82bfa5ba7bbea87\": container with ID starting with a30bd551ea9a4d7c812fabaa28232e3edb8b6921fbfb8dedb82bfa5ba7bbea87 not found: ID does not exist" containerID="a30bd551ea9a4d7c812fabaa28232e3edb8b6921fbfb8dedb82bfa5ba7bbea87" Oct 01 13:24:10 crc kubenswrapper[4727]: I1001 13:24:10.888288 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30bd551ea9a4d7c812fabaa28232e3edb8b6921fbfb8dedb82bfa5ba7bbea87"} err="failed to get container status \"a30bd551ea9a4d7c812fabaa28232e3edb8b6921fbfb8dedb82bfa5ba7bbea87\": rpc error: code = NotFound desc = could not find container \"a30bd551ea9a4d7c812fabaa28232e3edb8b6921fbfb8dedb82bfa5ba7bbea87\": container with ID starting with a30bd551ea9a4d7c812fabaa28232e3edb8b6921fbfb8dedb82bfa5ba7bbea87 not found: ID does not exist" Oct 01 13:24:12 crc kubenswrapper[4727]: I1001 13:24:12.384459 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" path="/var/lib/kubelet/pods/3713fdfb-7dbf-4f1c-bc32-542c815532a7/volumes" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.809573 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2bkrt"] Oct 01 13:24:16 crc kubenswrapper[4727]: E1001 13:24:16.810886 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="becbb755-aeb5-408a-935e-b86cb927f62d" containerName="registry-server" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.810903 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="becbb755-aeb5-408a-935e-b86cb927f62d" containerName="registry-server" Oct 01 13:24:16 crc kubenswrapper[4727]: E1001 13:24:16.810927 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faedd224-4efd-43f6-b18d-4c126f7a5353" containerName="extract-content" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.810934 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="faedd224-4efd-43f6-b18d-4c126f7a5353" containerName="extract-content" Oct 01 13:24:16 crc kubenswrapper[4727]: E1001 13:24:16.810945 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" containerName="registry-server" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.810953 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" containerName="registry-server" Oct 01 13:24:16 crc kubenswrapper[4727]: E1001 13:24:16.810966 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="becbb755-aeb5-408a-935e-b86cb927f62d" containerName="extract-utilities" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.810977 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="becbb755-aeb5-408a-935e-b86cb927f62d" containerName="extract-utilities" Oct 01 13:24:16 crc kubenswrapper[4727]: E1001 13:24:16.811014 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" containerName="extract-content" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.811027 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" containerName="extract-content" Oct 01 13:24:16 crc kubenswrapper[4727]: E1001 13:24:16.811056 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faedd224-4efd-43f6-b18d-4c126f7a5353" containerName="registry-server" Oct 01 
13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.811063 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="faedd224-4efd-43f6-b18d-4c126f7a5353" containerName="registry-server" Oct 01 13:24:16 crc kubenswrapper[4727]: E1001 13:24:16.811088 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" containerName="extract-utilities" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.811097 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" containerName="extract-utilities" Oct 01 13:24:16 crc kubenswrapper[4727]: E1001 13:24:16.811124 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="becbb755-aeb5-408a-935e-b86cb927f62d" containerName="extract-content" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.811131 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="becbb755-aeb5-408a-935e-b86cb927f62d" containerName="extract-content" Oct 01 13:24:16 crc kubenswrapper[4727]: E1001 13:24:16.811152 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faedd224-4efd-43f6-b18d-4c126f7a5353" containerName="extract-utilities" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.811159 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="faedd224-4efd-43f6-b18d-4c126f7a5353" containerName="extract-utilities" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.811654 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="becbb755-aeb5-408a-935e-b86cb927f62d" containerName="registry-server" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.811716 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="3713fdfb-7dbf-4f1c-bc32-542c815532a7" containerName="registry-server" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.811738 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="faedd224-4efd-43f6-b18d-4c126f7a5353" containerName="registry-server" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.814569 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.853134 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2bkrt"] Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.886507 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-utilities\") pod \"certified-operators-2bkrt\" (UID: \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\") " pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.886624 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-catalog-content\") pod \"certified-operators-2bkrt\" (UID: \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\") " pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.886722 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6pxf\" (UniqueName: \"kubernetes.io/projected/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-kube-api-access-w6pxf\") pod \"certified-operators-2bkrt\" (UID: \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\") " pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.988045 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6pxf\" (UniqueName: \"kubernetes.io/projected/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-kube-api-access-w6pxf\") pod \"certified-operators-2bkrt\" (UID: \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\") " pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.988188 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-utilities\") pod \"certified-operators-2bkrt\" (UID: \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\") " pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.989014 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-utilities\") pod \"certified-operators-2bkrt\" (UID: \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\") " pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.989095 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-catalog-content\") pod \"certified-operators-2bkrt\" (UID: \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\") " pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:16 crc kubenswrapper[4727]: I1001 13:24:16.989385 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-catalog-content\") pod \"certified-operators-2bkrt\" (UID: \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\") " pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:17 crc kubenswrapper[4727]: I1001 13:24:17.015901 4727 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w6pxf\" (UniqueName: \"kubernetes.io/projected/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-kube-api-access-w6pxf\") pod \"certified-operators-2bkrt\" (UID: \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\") " pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:17 crc kubenswrapper[4727]: I1001 13:24:17.147798 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:17 crc kubenswrapper[4727]: I1001 13:24:17.695119 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2bkrt"] Oct 01 13:24:17 crc kubenswrapper[4727]: I1001 13:24:17.839245 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bkrt" event={"ID":"7ce17cc2-0929-4e49-b0e9-3a76adb0909b","Type":"ContainerStarted","Data":"e4821561f58044b2cd9fd20b9145d13d47de84f37a46c1ceae3c5b8b8db3c005"} Oct 01 13:24:18 crc kubenswrapper[4727]: I1001 13:24:18.864658 4727 generic.go:334] "Generic (PLEG): container finished" podID="7ce17cc2-0929-4e49-b0e9-3a76adb0909b" containerID="c52ebe2036f46520adc6c754c5100ad5965df04b27ae86c17fcb6e9cb2050489" exitCode=0 Oct 01 13:24:18 crc kubenswrapper[4727]: I1001 13:24:18.864899 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bkrt" event={"ID":"7ce17cc2-0929-4e49-b0e9-3a76adb0909b","Type":"ContainerDied","Data":"c52ebe2036f46520adc6c754c5100ad5965df04b27ae86c17fcb6e9cb2050489"} Oct 01 13:24:18 crc kubenswrapper[4727]: I1001 13:24:18.869959 4727 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:24:19 crc kubenswrapper[4727]: I1001 13:24:19.876679 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bkrt" event={"ID":"7ce17cc2-0929-4e49-b0e9-3a76adb0909b","Type":"ContainerStarted","Data":"9a66cb55307ff8bf399f8b5e12b204a74da55b30b3cf55ce6dceb16007265f3c"} Oct 01 13:24:20 crc kubenswrapper[4727]: I1001 13:24:20.886939 4727 generic.go:334] "Generic (PLEG): container finished" podID="7ce17cc2-0929-4e49-b0e9-3a76adb0909b" containerID="9a66cb55307ff8bf399f8b5e12b204a74da55b30b3cf55ce6dceb16007265f3c" exitCode=0 Oct 01 13:24:20 crc kubenswrapper[4727]: I1001 13:24:20.887012 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bkrt" event={"ID":"7ce17cc2-0929-4e49-b0e9-3a76adb0909b","Type":"ContainerDied","Data":"9a66cb55307ff8bf399f8b5e12b204a74da55b30b3cf55ce6dceb16007265f3c"} Oct 01 13:24:21 crc kubenswrapper[4727]: I1001 13:24:21.900409 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bkrt" event={"ID":"7ce17cc2-0929-4e49-b0e9-3a76adb0909b","Type":"ContainerStarted","Data":"02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875"} Oct 01 13:24:21 crc kubenswrapper[4727]: I1001 13:24:21.916902 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2bkrt" podStartSLOduration=3.205044334 podStartE2EDuration="5.916884689s" podCreationTimestamp="2025-10-01 13:24:16 +0000 UTC" firstStartedPulling="2025-10-01 13:24:18.869743699 +0000 UTC m=+2837.191098536" lastFinishedPulling="2025-10-01 13:24:21.581584054 +0000 UTC m=+2839.902938891" observedRunningTime="2025-10-01 13:24:21.915216136 +0000 UTC m=+2840.236570993" watchObservedRunningTime="2025-10-01 
13:24:21.916884689 +0000 UTC m=+2840.238239526" Oct 01 13:24:27 crc kubenswrapper[4727]: I1001 13:24:27.148715 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:27 crc kubenswrapper[4727]: I1001 13:24:27.149103 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:27 crc kubenswrapper[4727]: I1001 13:24:27.204986 4727 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:27 crc kubenswrapper[4727]: I1001 13:24:27.998804 4727 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:28 crc kubenswrapper[4727]: I1001 13:24:28.047193 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2bkrt"] Oct 01 13:24:29 crc kubenswrapper[4727]: I1001 13:24:29.969687 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2bkrt" podUID="7ce17cc2-0929-4e49-b0e9-3a76adb0909b" containerName="registry-server" containerID="cri-o://02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875" gracePeriod=2 Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.510265 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.574140 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-utilities\") pod \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\" (UID: \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\") " Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.574459 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-catalog-content\") pod \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\" (UID: \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\") " Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.574646 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6pxf\" (UniqueName: \"kubernetes.io/projected/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-kube-api-access-w6pxf\") pod \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\" (UID: \"7ce17cc2-0929-4e49-b0e9-3a76adb0909b\") " Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.575778 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-utilities" (OuterVolumeSpecName: "utilities") pod "7ce17cc2-0929-4e49-b0e9-3a76adb0909b" (UID: "7ce17cc2-0929-4e49-b0e9-3a76adb0909b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.590051 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-kube-api-access-w6pxf" (OuterVolumeSpecName: "kube-api-access-w6pxf") pod "7ce17cc2-0929-4e49-b0e9-3a76adb0909b" (UID: "7ce17cc2-0929-4e49-b0e9-3a76adb0909b"). InnerVolumeSpecName "kube-api-access-w6pxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.678647 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6pxf\" (UniqueName: \"kubernetes.io/projected/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-kube-api-access-w6pxf\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.678691 4727 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.985603 4727 generic.go:334] "Generic (PLEG): container finished" podID="7ce17cc2-0929-4e49-b0e9-3a76adb0909b" containerID="02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875" exitCode=0 Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.986175 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bkrt" event={"ID":"7ce17cc2-0929-4e49-b0e9-3a76adb0909b","Type":"ContainerDied","Data":"02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875"} Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.986217 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2bkrt" event={"ID":"7ce17cc2-0929-4e49-b0e9-3a76adb0909b","Type":"ContainerDied","Data":"e4821561f58044b2cd9fd20b9145d13d47de84f37a46c1ceae3c5b8b8db3c005"} Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.986239 4727 scope.go:117] "RemoveContainer" containerID="02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875" Oct 01 13:24:30 crc kubenswrapper[4727]: I1001 13:24:30.986459 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2bkrt" Oct 01 13:24:31 crc kubenswrapper[4727]: I1001 13:24:31.013962 4727 scope.go:117] "RemoveContainer" containerID="9a66cb55307ff8bf399f8b5e12b204a74da55b30b3cf55ce6dceb16007265f3c" Oct 01 13:24:31 crc kubenswrapper[4727]: I1001 13:24:31.037143 4727 scope.go:117] "RemoveContainer" containerID="c52ebe2036f46520adc6c754c5100ad5965df04b27ae86c17fcb6e9cb2050489" Oct 01 13:24:31 crc kubenswrapper[4727]: I1001 13:24:31.075312 4727 scope.go:117] "RemoveContainer" containerID="02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875" Oct 01 13:24:31 crc kubenswrapper[4727]: E1001 13:24:31.076170 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875\": container with ID starting with 02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875 not found: ID does not exist" containerID="02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875" Oct 01 13:24:31 crc kubenswrapper[4727]: I1001 13:24:31.076199 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875"} err="failed to get container status \"02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875\": rpc error: code = NotFound desc = could not find container \"02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875\": container with ID starting with 02dd978a33515b22c38de19ae06024ea76fbdd4971427a610f42baf1ba59c875 not found: ID does not exist" Oct 01 13:24:31 crc kubenswrapper[4727]: I1001 13:24:31.076221 4727 scope.go:117] "RemoveContainer" containerID="9a66cb55307ff8bf399f8b5e12b204a74da55b30b3cf55ce6dceb16007265f3c" Oct 01 13:24:31 crc kubenswrapper[4727]: E1001 13:24:31.076490 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a66cb55307ff8bf399f8b5e12b204a74da55b30b3cf55ce6dceb16007265f3c\": container with ID starting with 9a66cb55307ff8bf399f8b5e12b204a74da55b30b3cf55ce6dceb16007265f3c not found: ID does not exist" containerID="9a66cb55307ff8bf399f8b5e12b204a74da55b30b3cf55ce6dceb16007265f3c" Oct 01 13:24:31 crc kubenswrapper[4727]: I1001 13:24:31.076511 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a66cb55307ff8bf399f8b5e12b204a74da55b30b3cf55ce6dceb16007265f3c"} err="failed to get container status \"9a66cb55307ff8bf399f8b5e12b204a74da55b30b3cf55ce6dceb16007265f3c\": rpc error: code = NotFound desc = could not find container \"9a66cb55307ff8bf399f8b5e12b204a74da55b30b3cf55ce6dceb16007265f3c\": container with ID starting with 9a66cb55307ff8bf399f8b5e12b204a74da55b30b3cf55ce6dceb16007265f3c not found: ID does not exist" Oct 01 13:24:31 crc kubenswrapper[4727]: I1001 13:24:31.076525 4727 scope.go:117] "RemoveContainer" containerID="c52ebe2036f46520adc6c754c5100ad5965df04b27ae86c17fcb6e9cb2050489" Oct 01 13:24:31 crc kubenswrapper[4727]: E1001 13:24:31.076791 4727 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c52ebe2036f46520adc6c754c5100ad5965df04b27ae86c17fcb6e9cb2050489\": container with ID starting with c52ebe2036f46520adc6c754c5100ad5965df04b27ae86c17fcb6e9cb2050489 not found: ID does not exist" containerID="c52ebe2036f46520adc6c754c5100ad5965df04b27ae86c17fcb6e9cb2050489" 
Oct 01 13:24:31 crc kubenswrapper[4727]: I1001 13:24:31.076811 4727 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52ebe2036f46520adc6c754c5100ad5965df04b27ae86c17fcb6e9cb2050489"} err="failed to get container status \"c52ebe2036f46520adc6c754c5100ad5965df04b27ae86c17fcb6e9cb2050489\": rpc error: code = NotFound desc = could not find container \"c52ebe2036f46520adc6c754c5100ad5965df04b27ae86c17fcb6e9cb2050489\": container with ID starting with c52ebe2036f46520adc6c754c5100ad5965df04b27ae86c17fcb6e9cb2050489 not found: ID does not exist" Oct 01 13:24:31 crc kubenswrapper[4727]: I1001 13:24:31.129561 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ce17cc2-0929-4e49-b0e9-3a76adb0909b" (UID: "7ce17cc2-0929-4e49-b0e9-3a76adb0909b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:24:31 crc kubenswrapper[4727]: I1001 13:24:31.188181 4727 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce17cc2-0929-4e49-b0e9-3a76adb0909b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:31 crc kubenswrapper[4727]: I1001 13:24:31.333862 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2bkrt"] Oct 01 13:24:31 crc kubenswrapper[4727]: I1001 13:24:31.344404 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2bkrt"] Oct 01 13:24:32 crc kubenswrapper[4727]: I1001 13:24:32.383499 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce17cc2-0929-4e49-b0e9-3a76adb0909b" path="/var/lib/kubelet/pods/7ce17cc2-0929-4e49-b0e9-3a76adb0909b/volumes" Oct 01 13:24:39 crc kubenswrapper[4727]: I1001 13:24:39.061961 4727 generic.go:334] "Generic (PLEG): container finished" podID="e4381656-3c9d-44d0-a003-01b7d8b91b19" containerID="3ed39f8cbb5272e90e822965dee6caf1bbf712401b23ad0b558642ced7ab89ee" exitCode=0 Oct 01 13:24:39 crc kubenswrapper[4727]: I1001 13:24:39.062065 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/crc-debug-m798g" event={"ID":"e4381656-3c9d-44d0-a003-01b7d8b91b19","Type":"ContainerDied","Data":"3ed39f8cbb5272e90e822965dee6caf1bbf712401b23ad0b558642ced7ab89ee"} Oct 01 13:24:40 crc kubenswrapper[4727]: I1001 13:24:40.179042 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6wxt/crc-debug-m798g" Oct 01 13:24:40 crc kubenswrapper[4727]: I1001 13:24:40.215976 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6wxt/crc-debug-m798g"] Oct 01 13:24:40 crc kubenswrapper[4727]: I1001 13:24:40.226430 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6wxt/crc-debug-m798g"] Oct 01 13:24:40 crc kubenswrapper[4727]: I1001 13:24:40.293387 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4381656-3c9d-44d0-a003-01b7d8b91b19-host\") pod \"e4381656-3c9d-44d0-a003-01b7d8b91b19\" (UID: \"e4381656-3c9d-44d0-a003-01b7d8b91b19\") " Oct 01 13:24:40 crc kubenswrapper[4727]: I1001 13:24:40.293538 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s46hl\" (UniqueName: \"kubernetes.io/projected/e4381656-3c9d-44d0-a003-01b7d8b91b19-kube-api-access-s46hl\") pod \"e4381656-3c9d-44d0-a003-01b7d8b91b19\" (UID: \"e4381656-3c9d-44d0-a003-01b7d8b91b19\") " Oct 01 13:24:40 crc kubenswrapper[4727]: I1001 13:24:40.293536 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4381656-3c9d-44d0-a003-01b7d8b91b19-host" (OuterVolumeSpecName: "host") pod "e4381656-3c9d-44d0-a003-01b7d8b91b19" (UID: "e4381656-3c9d-44d0-a003-01b7d8b91b19"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:24:40 crc kubenswrapper[4727]: I1001 13:24:40.294042 4727 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4381656-3c9d-44d0-a003-01b7d8b91b19-host\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:40 crc kubenswrapper[4727]: I1001 13:24:40.302351 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4381656-3c9d-44d0-a003-01b7d8b91b19-kube-api-access-s46hl" (OuterVolumeSpecName: "kube-api-access-s46hl") pod "e4381656-3c9d-44d0-a003-01b7d8b91b19" (UID: "e4381656-3c9d-44d0-a003-01b7d8b91b19"). InnerVolumeSpecName "kube-api-access-s46hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:40 crc kubenswrapper[4727]: I1001 13:24:40.385419 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4381656-3c9d-44d0-a003-01b7d8b91b19" path="/var/lib/kubelet/pods/e4381656-3c9d-44d0-a003-01b7d8b91b19/volumes" Oct 01 13:24:40 crc kubenswrapper[4727]: I1001 13:24:40.397521 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s46hl\" (UniqueName: \"kubernetes.io/projected/e4381656-3c9d-44d0-a003-01b7d8b91b19-kube-api-access-s46hl\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.081612 4727 scope.go:117] "RemoveContainer" containerID="3ed39f8cbb5272e90e822965dee6caf1bbf712401b23ad0b558642ced7ab89ee" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.081812 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6wxt/crc-debug-m798g" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.414709 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6wxt/crc-debug-vn9lq"] Oct 01 13:24:41 crc kubenswrapper[4727]: E1001 13:24:41.415684 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce17cc2-0929-4e49-b0e9-3a76adb0909b" containerName="extract-content" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.415703 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce17cc2-0929-4e49-b0e9-3a76adb0909b" containerName="extract-content" Oct 01 13:24:41 crc kubenswrapper[4727]: E1001 13:24:41.415737 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce17cc2-0929-4e49-b0e9-3a76adb0909b" containerName="extract-utilities" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.415745 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce17cc2-0929-4e49-b0e9-3a76adb0909b" containerName="extract-utilities" Oct 01 13:24:41 crc kubenswrapper[4727]: E1001 13:24:41.415761 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce17cc2-0929-4e49-b0e9-3a76adb0909b" containerName="registry-server" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.415768 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce17cc2-0929-4e49-b0e9-3a76adb0909b" containerName="registry-server" Oct 01 13:24:41 crc kubenswrapper[4727]: E1001 13:24:41.415853 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4381656-3c9d-44d0-a003-01b7d8b91b19" containerName="container-00" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.415864 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4381656-3c9d-44d0-a003-01b7d8b91b19" containerName="container-00" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.416841 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4381656-3c9d-44d0-a003-01b7d8b91b19" containerName="container-00" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.416865 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce17cc2-0929-4e49-b0e9-3a76adb0909b" containerName="registry-server" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.417881 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.527744 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4630a2f7-1fbe-47d7-9e41-7cc069599af8-host\") pod \"crc-debug-vn9lq\" (UID: \"4630a2f7-1fbe-47d7-9e41-7cc069599af8\") " pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.528285 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-554h8\" (UniqueName: \"kubernetes.io/projected/4630a2f7-1fbe-47d7-9e41-7cc069599af8-kube-api-access-554h8\") pod \"crc-debug-vn9lq\" (UID: \"4630a2f7-1fbe-47d7-9e41-7cc069599af8\") " pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.631244 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4630a2f7-1fbe-47d7-9e41-7cc069599af8-host\") pod \"crc-debug-vn9lq\" (UID: \"4630a2f7-1fbe-47d7-9e41-7cc069599af8\") " pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.631306 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-554h8\" (UniqueName: \"kubernetes.io/projected/4630a2f7-1fbe-47d7-9e41-7cc069599af8-kube-api-access-554h8\") pod \"crc-debug-vn9lq\" (UID: \"4630a2f7-1fbe-47d7-9e41-7cc069599af8\") " pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.631447 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4630a2f7-1fbe-47d7-9e41-7cc069599af8-host\") pod \"crc-debug-vn9lq\" (UID: \"4630a2f7-1fbe-47d7-9e41-7cc069599af8\") " pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.653649 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-554h8\" (UniqueName: \"kubernetes.io/projected/4630a2f7-1fbe-47d7-9e41-7cc069599af8-kube-api-access-554h8\") pod \"crc-debug-vn9lq\" (UID: \"4630a2f7-1fbe-47d7-9e41-7cc069599af8\") " pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" Oct 01 13:24:41 crc kubenswrapper[4727]: I1001 13:24:41.738175 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" Oct 01 13:24:41 crc kubenswrapper[4727]: W1001 13:24:41.768286 4727 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4630a2f7_1fbe_47d7_9e41_7cc069599af8.slice/crio-28f8185e91547902237dc41593d4c887e21cdc12bee60c3b57848a1eb0797708 WatchSource:0}: Error finding container 28f8185e91547902237dc41593d4c887e21cdc12bee60c3b57848a1eb0797708: Status 404 returned error can't find the container with id 28f8185e91547902237dc41593d4c887e21cdc12bee60c3b57848a1eb0797708 Oct 01 13:24:42 crc kubenswrapper[4727]: I1001 13:24:42.094510 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" event={"ID":"4630a2f7-1fbe-47d7-9e41-7cc069599af8","Type":"ContainerStarted","Data":"49ff5318c172c5fd587522c95abea67b0665b1558f83b0953781c0f57bc1ccbf"} Oct 01 13:24:42 crc kubenswrapper[4727]: I1001 13:24:42.094562 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" event={"ID":"4630a2f7-1fbe-47d7-9e41-7cc069599af8","Type":"ContainerStarted","Data":"28f8185e91547902237dc41593d4c887e21cdc12bee60c3b57848a1eb0797708"} Oct 01 13:24:42 crc kubenswrapper[4727]: I1001 13:24:42.115161 4727 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" podStartSLOduration=1.11514172 podStartE2EDuration="1.11514172s" podCreationTimestamp="2025-10-01 13:24:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:24:42.109548814 +0000 UTC m=+2860.430903661" watchObservedRunningTime="2025-10-01 13:24:42.11514172 +0000 UTC m=+2860.436496547" Oct 01 13:24:43 crc kubenswrapper[4727]: I1001 13:24:43.105916 4727 generic.go:334] "Generic (PLEG): container finished" podID="4630a2f7-1fbe-47d7-9e41-7cc069599af8" containerID="49ff5318c172c5fd587522c95abea67b0665b1558f83b0953781c0f57bc1ccbf" exitCode=0 Oct 01 13:24:43 crc kubenswrapper[4727]: I1001 13:24:43.105965 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" event={"ID":"4630a2f7-1fbe-47d7-9e41-7cc069599af8","Type":"ContainerDied","Data":"49ff5318c172c5fd587522c95abea67b0665b1558f83b0953781c0f57bc1ccbf"} Oct 01 13:24:44 crc kubenswrapper[4727]: I1001 13:24:44.233122 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" Oct 01 13:24:44 crc kubenswrapper[4727]: I1001 13:24:44.375512 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4630a2f7-1fbe-47d7-9e41-7cc069599af8-host\") pod \"4630a2f7-1fbe-47d7-9e41-7cc069599af8\" (UID: \"4630a2f7-1fbe-47d7-9e41-7cc069599af8\") " Oct 01 13:24:44 crc kubenswrapper[4727]: I1001 13:24:44.375583 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4630a2f7-1fbe-47d7-9e41-7cc069599af8-host" (OuterVolumeSpecName: "host") pod "4630a2f7-1fbe-47d7-9e41-7cc069599af8" (UID: "4630a2f7-1fbe-47d7-9e41-7cc069599af8"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:24:44 crc kubenswrapper[4727]: I1001 13:24:44.375671 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-554h8\" (UniqueName: \"kubernetes.io/projected/4630a2f7-1fbe-47d7-9e41-7cc069599af8-kube-api-access-554h8\") pod \"4630a2f7-1fbe-47d7-9e41-7cc069599af8\" (UID: \"4630a2f7-1fbe-47d7-9e41-7cc069599af8\") " Oct 01 13:24:44 crc kubenswrapper[4727]: I1001 13:24:44.376288 4727 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4630a2f7-1fbe-47d7-9e41-7cc069599af8-host\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:44 crc kubenswrapper[4727]: I1001 13:24:44.383605 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4630a2f7-1fbe-47d7-9e41-7cc069599af8-kube-api-access-554h8" (OuterVolumeSpecName: "kube-api-access-554h8") pod "4630a2f7-1fbe-47d7-9e41-7cc069599af8" (UID: "4630a2f7-1fbe-47d7-9e41-7cc069599af8"). InnerVolumeSpecName "kube-api-access-554h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:44 crc kubenswrapper[4727]: I1001 13:24:44.480528 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-554h8\" (UniqueName: \"kubernetes.io/projected/4630a2f7-1fbe-47d7-9e41-7cc069599af8-kube-api-access-554h8\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:45 crc kubenswrapper[4727]: I1001 13:24:45.130651 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" event={"ID":"4630a2f7-1fbe-47d7-9e41-7cc069599af8","Type":"ContainerDied","Data":"28f8185e91547902237dc41593d4c887e21cdc12bee60c3b57848a1eb0797708"} Oct 01 13:24:45 crc kubenswrapper[4727]: I1001 13:24:45.130704 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28f8185e91547902237dc41593d4c887e21cdc12bee60c3b57848a1eb0797708" Oct 01 13:24:45 crc kubenswrapper[4727]: I1001 13:24:45.130774 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6wxt/crc-debug-vn9lq" Oct 01 13:24:48 crc kubenswrapper[4727]: I1001 13:24:48.867539 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6wxt/crc-debug-vn9lq"] Oct 01 13:24:48 crc kubenswrapper[4727]: I1001 13:24:48.876573 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6wxt/crc-debug-vn9lq"] Oct 01 13:24:50 crc kubenswrapper[4727]: I1001 13:24:50.103507 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6wxt/crc-debug-b7z46"] Oct 01 13:24:50 crc kubenswrapper[4727]: E1001 13:24:50.105442 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4630a2f7-1fbe-47d7-9e41-7cc069599af8" containerName="container-00" Oct 01 13:24:50 crc kubenswrapper[4727]: I1001 13:24:50.105498 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="4630a2f7-1fbe-47d7-9e41-7cc069599af8" containerName="container-00" Oct 01 13:24:50 crc kubenswrapper[4727]: I1001 13:24:50.105802 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="4630a2f7-1fbe-47d7-9e41-7cc069599af8" containerName="container-00" Oct 01 13:24:50 crc kubenswrapper[4727]: I1001 13:24:50.106575 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6wxt/crc-debug-b7z46" Oct 01 13:24:50 crc kubenswrapper[4727]: I1001 13:24:50.294344 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44876bc2-f5ce-41d6-b972-360d61acc6fb-host\") pod \"crc-debug-b7z46\" (UID: \"44876bc2-f5ce-41d6-b972-360d61acc6fb\") " pod="openshift-must-gather-d6wxt/crc-debug-b7z46" Oct 01 13:24:50 crc kubenswrapper[4727]: I1001 13:24:50.294830 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n8kr\" (UniqueName: \"kubernetes.io/projected/44876bc2-f5ce-41d6-b972-360d61acc6fb-kube-api-access-8n8kr\") pod \"crc-debug-b7z46\" (UID: \"44876bc2-f5ce-41d6-b972-360d61acc6fb\") " pod="openshift-must-gather-d6wxt/crc-debug-b7z46" Oct 01 13:24:50 crc kubenswrapper[4727]: I1001 13:24:50.393400 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4630a2f7-1fbe-47d7-9e41-7cc069599af8" path="/var/lib/kubelet/pods/4630a2f7-1fbe-47d7-9e41-7cc069599af8/volumes" Oct 01 13:24:50 crc kubenswrapper[4727]: I1001 13:24:50.397387 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n8kr\" (UniqueName: \"kubernetes.io/projected/44876bc2-f5ce-41d6-b972-360d61acc6fb-kube-api-access-8n8kr\") pod \"crc-debug-b7z46\" (UID: \"44876bc2-f5ce-41d6-b972-360d61acc6fb\") " pod="openshift-must-gather-d6wxt/crc-debug-b7z46" Oct 01 13:24:50 crc kubenswrapper[4727]: I1001 13:24:50.397710 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44876bc2-f5ce-41d6-b972-360d61acc6fb-host\") pod \"crc-debug-b7z46\" (UID: \"44876bc2-f5ce-41d6-b972-360d61acc6fb\") " pod="openshift-must-gather-d6wxt/crc-debug-b7z46" Oct 01 13:24:50 crc kubenswrapper[4727]: I1001 13:24:50.397867 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44876bc2-f5ce-41d6-b972-360d61acc6fb-host\") pod \"crc-debug-b7z46\" (UID: \"44876bc2-f5ce-41d6-b972-360d61acc6fb\") " pod="openshift-must-gather-d6wxt/crc-debug-b7z46" Oct 01 13:24:50 crc kubenswrapper[4727]: I1001 13:24:50.427590 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n8kr\" (UniqueName: \"kubernetes.io/projected/44876bc2-f5ce-41d6-b972-360d61acc6fb-kube-api-access-8n8kr\") pod \"crc-debug-b7z46\" (UID: \"44876bc2-f5ce-41d6-b972-360d61acc6fb\") " pod="openshift-must-gather-d6wxt/crc-debug-b7z46" Oct 01 13:24:50 crc kubenswrapper[4727]: I1001 13:24:50.725243 4727 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6wxt/crc-debug-b7z46" Oct 01 13:24:51 crc kubenswrapper[4727]: I1001 13:24:51.210749 4727 generic.go:334] "Generic (PLEG): container finished" podID="44876bc2-f5ce-41d6-b972-360d61acc6fb" containerID="420fd61f3734518eed153cfba4606e234efa05d38837a87073e865931b70f1f1" exitCode=0 Oct 01 13:24:51 crc kubenswrapper[4727]: I1001 13:24:51.211235 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/crc-debug-b7z46" event={"ID":"44876bc2-f5ce-41d6-b972-360d61acc6fb","Type":"ContainerDied","Data":"420fd61f3734518eed153cfba4606e234efa05d38837a87073e865931b70f1f1"} Oct 01 13:24:51 crc kubenswrapper[4727]: I1001 13:24:51.211332 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/crc-debug-b7z46" event={"ID":"44876bc2-f5ce-41d6-b972-360d61acc6fb","Type":"ContainerStarted","Data":"1f90f882b707f6e8ceb5fed6446a178e76d7268bdcaa6f60e3f6b636433f7012"} Oct 01 13:24:51 crc kubenswrapper[4727]: I1001 13:24:51.257624 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6wxt/crc-debug-b7z46"] Oct 01 13:24:51 crc kubenswrapper[4727]: I1001 13:24:51.265711 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6wxt/crc-debug-b7z46"] Oct 01 13:24:52 crc kubenswrapper[4727]: I1001 13:24:52.324265 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6wxt/crc-debug-b7z46" Oct 01 13:24:52 crc kubenswrapper[4727]: I1001 13:24:52.441092 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44876bc2-f5ce-41d6-b972-360d61acc6fb-host\") pod \"44876bc2-f5ce-41d6-b972-360d61acc6fb\" (UID: \"44876bc2-f5ce-41d6-b972-360d61acc6fb\") " Oct 01 13:24:52 crc kubenswrapper[4727]: I1001 13:24:52.441186 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44876bc2-f5ce-41d6-b972-360d61acc6fb-host" (OuterVolumeSpecName: "host") pod "44876bc2-f5ce-41d6-b972-360d61acc6fb" (UID: "44876bc2-f5ce-41d6-b972-360d61acc6fb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:24:52 crc kubenswrapper[4727]: I1001 13:24:52.441501 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n8kr\" (UniqueName: \"kubernetes.io/projected/44876bc2-f5ce-41d6-b972-360d61acc6fb-kube-api-access-8n8kr\") pod \"44876bc2-f5ce-41d6-b972-360d61acc6fb\" (UID: \"44876bc2-f5ce-41d6-b972-360d61acc6fb\") " Oct 01 13:24:52 crc kubenswrapper[4727]: I1001 13:24:52.443629 4727 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/44876bc2-f5ce-41d6-b972-360d61acc6fb-host\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:52 crc kubenswrapper[4727]: I1001 13:24:52.461213 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44876bc2-f5ce-41d6-b972-360d61acc6fb-kube-api-access-8n8kr" (OuterVolumeSpecName: "kube-api-access-8n8kr") pod "44876bc2-f5ce-41d6-b972-360d61acc6fb" (UID: "44876bc2-f5ce-41d6-b972-360d61acc6fb"). InnerVolumeSpecName "kube-api-access-8n8kr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:52 crc kubenswrapper[4727]: I1001 13:24:52.547917 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n8kr\" (UniqueName: \"kubernetes.io/projected/44876bc2-f5ce-41d6-b972-360d61acc6fb-kube-api-access-8n8kr\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:52 crc kubenswrapper[4727]: I1001 13:24:52.892217 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-p4w7m_21ca64fa-6683-4cac-97cd-32944d87bced/kube-rbac-proxy/0.log" Oct 01 13:24:52 crc kubenswrapper[4727]: I1001 13:24:52.998068 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-p4w7m_21ca64fa-6683-4cac-97cd-32944d87bced/manager/0.log" Oct 01 13:24:53 crc kubenswrapper[4727]: I1001 13:24:53.179867 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-j9gmz_8ed41f7a-f315-407e-b7a8-c5dc3fef764a/manager/0.log" Oct 01 13:24:53 crc kubenswrapper[4727]: I1001 13:24:53.192227 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-j9gmz_8ed41f7a-f315-407e-b7a8-c5dc3fef764a/kube-rbac-proxy/0.log" Oct 01 13:24:53 crc kubenswrapper[4727]: I1001 13:24:53.233505 4727 scope.go:117] "RemoveContainer" containerID="420fd61f3734518eed153cfba4606e234efa05d38837a87073e865931b70f1f1" Oct 01 13:24:53 crc kubenswrapper[4727]: I1001 13:24:53.233556 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6wxt/crc-debug-b7z46" Oct 01 13:24:53 crc kubenswrapper[4727]: I1001 13:24:53.403286 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-2mrqm_71ad1cc3-a660-4a74-b15d-b1c7e03bf785/kube-rbac-proxy/0.log" Oct 01 13:24:53 crc kubenswrapper[4727]: I1001 13:24:53.433221 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-2mrqm_71ad1cc3-a660-4a74-b15d-b1c7e03bf785/manager/0.log" Oct 01 13:24:53 crc kubenswrapper[4727]: I1001 13:24:53.493949 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm_3159c1e1-b299-4837-bb69-06e886f09112/util/0.log" Oct 01 13:24:53 crc kubenswrapper[4727]: I1001 13:24:53.704315 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm_3159c1e1-b299-4837-bb69-06e886f09112/util/0.log" Oct 01 13:24:53 crc kubenswrapper[4727]: I1001 13:24:53.740590 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm_3159c1e1-b299-4837-bb69-06e886f09112/pull/0.log" Oct 01 13:24:53 crc kubenswrapper[4727]: I1001 13:24:53.796593 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm_3159c1e1-b299-4837-bb69-06e886f09112/pull/0.log" Oct 01 13:24:53 crc kubenswrapper[4727]: I1001 13:24:53.959080 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm_3159c1e1-b299-4837-bb69-06e886f09112/pull/0.log" 
Oct 01 13:24:53 crc kubenswrapper[4727]: I1001 13:24:53.985014 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm_3159c1e1-b299-4837-bb69-06e886f09112/util/0.log" Oct 01 13:24:54 crc kubenswrapper[4727]: I1001 13:24:54.043586 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e71a2713daad49b92cdcd48dfa02949deb026e4e9584cde5b63a6608448fghm_3159c1e1-b299-4837-bb69-06e886f09112/extract/0.log" Oct 01 13:24:54 crc kubenswrapper[4727]: I1001 13:24:54.192826 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-82kh4_583d4e80-fb09-4853-8d80-9df371bf58e6/kube-rbac-proxy/0.log" Oct 01 13:24:54 crc kubenswrapper[4727]: I1001 13:24:54.309318 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-82kh4_583d4e80-fb09-4853-8d80-9df371bf58e6/manager/0.log" Oct 01 13:24:54 crc kubenswrapper[4727]: I1001 13:24:54.320547 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-dvddt_dc459bd0-7d95-4fe6-981a-7afdb763efa8/kube-rbac-proxy/0.log" Oct 01 13:24:54 crc kubenswrapper[4727]: I1001 13:24:54.386126 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44876bc2-f5ce-41d6-b972-360d61acc6fb" path="/var/lib/kubelet/pods/44876bc2-f5ce-41d6-b972-360d61acc6fb/volumes" Oct 01 13:24:54 crc kubenswrapper[4727]: I1001 13:24:54.456911 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-dvddt_dc459bd0-7d95-4fe6-981a-7afdb763efa8/manager/0.log" Oct 01 13:24:54 crc kubenswrapper[4727]: I1001 13:24:54.511448 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-6m8j5_4409e813-a7ba-440c-8ef3-22ecac8a1093/kube-rbac-proxy/0.log" Oct 01 13:24:54 crc kubenswrapper[4727]: I1001 13:24:54.554143 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-6m8j5_4409e813-a7ba-440c-8ef3-22ecac8a1093/manager/0.log" Oct 01 13:24:54 crc kubenswrapper[4727]: I1001 13:24:54.771491 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-c88bk_99ea0596-d1a9-434c-a176-0b4a244ecc83/kube-rbac-proxy/0.log" Oct 01 13:24:54 crc kubenswrapper[4727]: I1001 13:24:54.983787 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-c88bk_99ea0596-d1a9-434c-a176-0b4a244ecc83/manager/0.log" Oct 01 13:24:55 crc kubenswrapper[4727]: I1001 13:24:55.009851 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-6zhw5_47e3cb37-ce4b-4280-9863-ad6a95b1347c/kube-rbac-proxy/0.log" Oct 01 13:24:55 crc kubenswrapper[4727]: I1001 13:24:55.049248 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-6zhw5_47e3cb37-ce4b-4280-9863-ad6a95b1347c/manager/0.log" Oct 01 13:24:55 crc kubenswrapper[4727]: I1001 13:24:55.178122 4727 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-k7cf5_a5c6c947-8392-4385-9448-ca70c91635e6/kube-rbac-proxy/0.log" Oct 01 13:24:55 crc kubenswrapper[4727]: I1001 13:24:55.263245 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-vgqst_52bc77fe-21ba-4ac8-9fca-531e3c80432a/kube-rbac-proxy/0.log" Oct 01 13:24:55 crc kubenswrapper[4727]: I1001 13:24:55.328516 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-k7cf5_a5c6c947-8392-4385-9448-ca70c91635e6/manager/0.log" Oct 01 13:24:55 crc kubenswrapper[4727]: I1001 13:24:55.434844 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-vgqst_52bc77fe-21ba-4ac8-9fca-531e3c80432a/manager/0.log" Oct 01 13:24:55 crc kubenswrapper[4727]: I1001 13:24:55.561992 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-7vqrx_f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506/kube-rbac-proxy/0.log" Oct 01 13:24:55 crc kubenswrapper[4727]: I1001 13:24:55.616145 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-7vqrx_f31d7fb8-1ac0-4fd0-aa18-4cb9e879b506/manager/0.log" Oct 01 13:24:55 crc kubenswrapper[4727]: I1001 13:24:55.770587 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-lkdzs_5e40e563-9455-43dd-a3ef-e442010c31a4/kube-rbac-proxy/0.log" Oct 01 13:24:55 crc kubenswrapper[4727]: I1001 13:24:55.875711 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-lkdzs_5e40e563-9455-43dd-a3ef-e442010c31a4/manager/0.log" Oct 01 13:24:55 crc kubenswrapper[4727]: I1001 13:24:55.930882 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-dt9z7_18ea0de4-19a4-4417-a13e-bec65f0cfc31/kube-rbac-proxy/0.log" Oct 01 13:24:56 crc kubenswrapper[4727]: I1001 13:24:56.079330 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-x8pr2_7c69585d-d708-4863-9cdf-bace662d6658/kube-rbac-proxy/0.log" Oct 01 13:24:56 crc kubenswrapper[4727]: I1001 13:24:56.126087 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-dt9z7_18ea0de4-19a4-4417-a13e-bec65f0cfc31/manager/0.log" Oct 01 13:24:56 crc kubenswrapper[4727]: I1001 13:24:56.197336 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-x8pr2_7c69585d-d708-4863-9cdf-bace662d6658/manager/0.log" Oct 01 13:24:56 crc kubenswrapper[4727]: I1001 13:24:56.366844 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8c9z78p_4924da7d-07e9-4378-9965-c3e85c3018c8/kube-rbac-proxy/0.log" Oct 01 13:24:56 crc kubenswrapper[4727]: I1001 13:24:56.378801 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8c9z78p_4924da7d-07e9-4378-9965-c3e85c3018c8/manager/0.log" Oct 01 13:24:56 crc kubenswrapper[4727]: I1001 13:24:56.650589 4727 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5db568f97f-zfnfz_d6bc6c09-3c9e-4de0-bf11-239a93867c74/kube-rbac-proxy/0.log" Oct 01 13:24:56 crc kubenswrapper[4727]: I1001 13:24:56.765112 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-c85b59bf-qns8g_007fa737-02ad-4360-8e6f-245b87f1c91d/kube-rbac-proxy/0.log" Oct 01 13:24:56 crc kubenswrapper[4727]: I1001 13:24:56.907385 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-nlvhp_ea50a03d-3d5c-4e61-9703-6a8980e33a1f/registry-server/0.log" Oct 01 13:24:56 crc kubenswrapper[4727]: I1001 13:24:56.932508 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-c85b59bf-qns8g_007fa737-02ad-4360-8e6f-245b87f1c91d/operator/0.log" Oct 01 13:24:57 crc kubenswrapper[4727]: I1001 13:24:57.186159 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-wbl5k_b1322ef4-b813-41a1-a851-d9e96e4cf7ef/kube-rbac-proxy/0.log" Oct 01 13:24:57 crc kubenswrapper[4727]: I1001 13:24:57.323981 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-wbl5k_b1322ef4-b813-41a1-a851-d9e96e4cf7ef/manager/0.log" Oct 01 13:24:57 crc kubenswrapper[4727]: I1001 13:24:57.386683 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-6l8fp_7f874b80-31cc-4c3a-9506-999fb72deac5/kube-rbac-proxy/0.log" Oct 01 13:24:57 crc kubenswrapper[4727]: I1001 13:24:57.523964 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-6l8fp_7f874b80-31cc-4c3a-9506-999fb72deac5/manager/0.log" Oct 01 13:24:57 crc kubenswrapper[4727]: I1001 13:24:57.614024 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-kbtpt_d823b105-b073-44a4-9a1f-eb067b981295/operator/0.log" Oct 01 13:24:57 crc kubenswrapper[4727]: I1001 13:24:57.825426 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-czvw6_43e69ea0-ecf5-40a9-ae20-94ac949ebfeb/kube-rbac-proxy/0.log" Oct 01 13:24:57 crc kubenswrapper[4727]: I1001 13:24:57.833168 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-czvw6_43e69ea0-ecf5-40a9-ae20-94ac949ebfeb/manager/0.log" Oct 01 13:24:57 crc kubenswrapper[4727]: I1001 13:24:57.990913 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b5b89c9dd-6c9pp_cc1db3cf-e8c2-4209-9d01-bb825fb693d6/kube-rbac-proxy/0.log" Oct 01 13:24:58 crc kubenswrapper[4727]: I1001 13:24:58.024317 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5db568f97f-zfnfz_d6bc6c09-3c9e-4de0-bf11-239a93867c74/manager/0.log" Oct 01 13:24:58 crc kubenswrapper[4727]: I1001 13:24:58.134900 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-smzzs_cd54773a-d526-46e2-a6bd-703886de898c/kube-rbac-proxy/0.log" Oct 01 13:24:58 crc kubenswrapper[4727]: I1001 
13:24:58.137376 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b5b89c9dd-6c9pp_cc1db3cf-e8c2-4209-9d01-bb825fb693d6/manager/0.log" Oct 01 13:24:58 crc kubenswrapper[4727]: I1001 13:24:58.225611 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-smzzs_cd54773a-d526-46e2-a6bd-703886de898c/manager/0.log" Oct 01 13:24:58 crc kubenswrapper[4727]: I1001 13:24:58.302269 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-kcsrk_2325c2e9-2f53-48b4-8dfb-bc1089a0caab/kube-rbac-proxy/0.log" Oct 01 13:24:58 crc kubenswrapper[4727]: I1001 13:24:58.347469 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-kcsrk_2325c2e9-2f53-48b4-8dfb-bc1089a0caab/manager/0.log" Oct 01 13:25:13 crc kubenswrapper[4727]: I1001 13:25:13.530297 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xkkcc_d9815009-494f-4e87-9d55-da93dc61b078/control-plane-machine-set-operator/0.log" Oct 01 13:25:13 crc kubenswrapper[4727]: I1001 13:25:13.731291 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-psvph_e419dff2-2c6a-4c89-8d99-0374397903b1/machine-api-operator/0.log" Oct 01 13:25:13 crc kubenswrapper[4727]: I1001 13:25:13.746416 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-psvph_e419dff2-2c6a-4c89-8d99-0374397903b1/kube-rbac-proxy/0.log" Oct 01 13:25:27 crc kubenswrapper[4727]: I1001 13:25:27.403221 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-28n2t_81c18dc6-8f66-4bc2-9b57-867976fab5d8/cert-manager-controller/0.log" Oct 01 13:25:27 crc kubenswrapper[4727]: I1001 13:25:27.661053 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-xh4gb_c61e6a7c-4c30-46aa-a082-53ac21575230/cert-manager-cainjector/0.log" Oct 01 13:25:27 crc kubenswrapper[4727]: I1001 13:25:27.694513 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-sbfpp_5a9acff3-95f2-4c83-84a8-abe0aa97789c/cert-manager-webhook/0.log" Oct 01 13:25:33 crc kubenswrapper[4727]: I1001 13:25:33.291673 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:25:33 crc kubenswrapper[4727]: I1001 13:25:33.292479 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:25:41 crc kubenswrapper[4727]: I1001 13:25:41.661221 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-wbtm2_e5813fa0-c580-4ab9-8118-b8dc8ff39470/nmstate-console-plugin/0.log" Oct 01 13:25:41 crc kubenswrapper[4727]: I1001 13:25:41.857181 4727 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bzph7_307c2e1d-d5d0-4f21-a708-0a43cb624fff/nmstate-handler/0.log" Oct 01 13:25:41 crc kubenswrapper[4727]: I1001 13:25:41.927851 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-nclq7_a429079b-262e-4f8b-9fc8-4dc0ad068fd5/nmstate-metrics/0.log" Oct 01 13:25:41 crc kubenswrapper[4727]: I1001 13:25:41.948749 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-nclq7_a429079b-262e-4f8b-9fc8-4dc0ad068fd5/kube-rbac-proxy/0.log" Oct 01 13:25:42 crc kubenswrapper[4727]: I1001 13:25:42.133520 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-gpd58_872ad5b0-13ab-4c3f-aa66-4a2b7f8ca2b2/nmstate-operator/0.log" Oct 01 13:25:42 crc kubenswrapper[4727]: I1001 13:25:42.176829 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-7vwzd_724b6b3d-215d-4ccc-a966-fc58f517a29f/nmstate-webhook/0.log" Oct 01 13:25:56 crc kubenswrapper[4727]: I1001 13:25:56.093972 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-w2n4b_4f3c3856-d38f-49ec-8930-10cd2f8f2b61/kube-rbac-proxy/0.log" Oct 01 13:25:56 crc kubenswrapper[4727]: I1001 13:25:56.222770 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-w2n4b_4f3c3856-d38f-49ec-8930-10cd2f8f2b61/controller/0.log" Oct 01 13:25:56 crc kubenswrapper[4727]: I1001 13:25:56.377303 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/cp-frr-files/0.log" Oct 01 13:25:56 crc kubenswrapper[4727]: I1001 13:25:56.553282 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/cp-reloader/0.log" Oct 01 13:25:56 crc kubenswrapper[4727]: I1001 13:25:56.625331 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/cp-frr-files/0.log" Oct 01 13:25:56 crc kubenswrapper[4727]: I1001 13:25:56.647248 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/cp-reloader/0.log" Oct 01 13:25:56 crc kubenswrapper[4727]: I1001 13:25:56.650365 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/cp-metrics/0.log" Oct 01 13:25:56 crc kubenswrapper[4727]: I1001 13:25:56.839235 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/cp-frr-files/0.log" Oct 01 13:25:56 crc kubenswrapper[4727]: I1001 13:25:56.839472 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/cp-reloader/0.log" Oct 01 13:25:56 crc kubenswrapper[4727]: I1001 13:25:56.873682 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/cp-metrics/0.log" Oct 01 13:25:56 crc kubenswrapper[4727]: I1001 13:25:56.876591 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/cp-metrics/0.log" Oct 01 13:25:57 crc kubenswrapper[4727]: I1001 13:25:57.145102 4727 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/cp-reloader/0.log" Oct 01 13:25:57 crc kubenswrapper[4727]: I1001 13:25:57.163272 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/cp-frr-files/0.log" Oct 01 13:25:57 crc kubenswrapper[4727]: I1001 13:25:57.194529 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/cp-metrics/0.log" Oct 01 13:25:57 crc kubenswrapper[4727]: I1001 13:25:57.199466 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/controller/0.log" Oct 01 13:25:57 crc kubenswrapper[4727]: I1001 13:25:57.446600 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/frr-metrics/0.log" Oct 01 13:25:57 crc kubenswrapper[4727]: I1001 13:25:57.447137 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/kube-rbac-proxy/0.log" Oct 01 13:25:57 crc kubenswrapper[4727]: I1001 13:25:57.470805 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/kube-rbac-proxy-frr/0.log" Oct 01 13:25:57 crc kubenswrapper[4727]: I1001 13:25:57.718938 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/reloader/0.log" Oct 01 13:25:57 crc kubenswrapper[4727]: I1001 13:25:57.747760 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-4nbzc_84740a24-d4ce-4fea-83c6-c79d664ee07b/frr-k8s-webhook-server/0.log" Oct 01 13:25:57 crc kubenswrapper[4727]: I1001 13:25:57.981984 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-78766f874-pgvrs_158e3a7a-113f-4004-9c41-9676efc0e93a/manager/0.log" Oct 01 13:25:58 crc kubenswrapper[4727]: I1001 13:25:58.174961 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5c98f7df7d-57v2c_4871f70f-30b9-4353-a083-6c6913107fa1/webhook-server/0.log" Oct 01 13:25:58 crc kubenswrapper[4727]: I1001 13:25:58.369332 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-p277d_63cffc61-9887-47ef-85a1-7c6705e44845/kube-rbac-proxy/0.log" Oct 01 13:25:59 crc kubenswrapper[4727]: I1001 13:25:59.069560 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-p277d_63cffc61-9887-47ef-85a1-7c6705e44845/speaker/0.log" Oct 01 13:25:59 crc kubenswrapper[4727]: I1001 13:25:59.117950 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rtkfs_5361bcf2-5da0-41fa-8c27-3507e59217f9/frr/0.log" Oct 01 13:26:03 crc kubenswrapper[4727]: I1001 13:26:03.292447 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:26:03 crc kubenswrapper[4727]: I1001 13:26:03.293637 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" 
podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:26:11 crc kubenswrapper[4727]: I1001 13:26:11.833027 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q_684a10a2-03a1-405b-991c-a8aa282ac6ef/util/0.log" Oct 01 13:26:11 crc kubenswrapper[4727]: I1001 13:26:11.996309 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q_684a10a2-03a1-405b-991c-a8aa282ac6ef/util/0.log" Oct 01 13:26:12 crc kubenswrapper[4727]: I1001 13:26:12.056731 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q_684a10a2-03a1-405b-991c-a8aa282ac6ef/pull/0.log" Oct 01 13:26:12 crc kubenswrapper[4727]: I1001 13:26:12.074725 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q_684a10a2-03a1-405b-991c-a8aa282ac6ef/pull/0.log" Oct 01 13:26:12 crc kubenswrapper[4727]: I1001 13:26:12.284964 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q_684a10a2-03a1-405b-991c-a8aa282ac6ef/util/0.log" Oct 01 13:26:12 crc kubenswrapper[4727]: I1001 13:26:12.286831 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q_684a10a2-03a1-405b-991c-a8aa282ac6ef/extract/0.log" Oct 01 13:26:12 crc kubenswrapper[4727]: I1001 13:26:12.286924 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcvsh7q_684a10a2-03a1-405b-991c-a8aa282ac6ef/pull/0.log" Oct 01 13:26:12 crc kubenswrapper[4727]: I1001 13:26:12.487130 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s4cld_cd4c2104-54e9-42be-8a78-b4674c3e7b7c/extract-utilities/0.log" Oct 01 13:26:12 crc kubenswrapper[4727]: I1001 13:26:12.714049 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s4cld_cd4c2104-54e9-42be-8a78-b4674c3e7b7c/extract-utilities/0.log" Oct 01 13:26:12 crc kubenswrapper[4727]: I1001 13:26:12.765259 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s4cld_cd4c2104-54e9-42be-8a78-b4674c3e7b7c/extract-content/0.log" Oct 01 13:26:12 crc kubenswrapper[4727]: I1001 13:26:12.799380 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s4cld_cd4c2104-54e9-42be-8a78-b4674c3e7b7c/extract-content/0.log" Oct 01 13:26:12 crc kubenswrapper[4727]: I1001 13:26:12.948702 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s4cld_cd4c2104-54e9-42be-8a78-b4674c3e7b7c/extract-utilities/0.log" Oct 01 13:26:12 crc kubenswrapper[4727]: I1001 13:26:12.996394 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s4cld_cd4c2104-54e9-42be-8a78-b4674c3e7b7c/extract-content/0.log" Oct 01 13:26:13 crc kubenswrapper[4727]: I1001 13:26:13.222727 4727 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xq5nj_9d2b91c6-4b36-4774-b2ca-59e9d5757b15/extract-utilities/0.log" Oct 01 13:26:13 crc kubenswrapper[4727]: I1001 13:26:13.520311 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xq5nj_9d2b91c6-4b36-4774-b2ca-59e9d5757b15/extract-content/0.log" Oct 01 13:26:13 crc kubenswrapper[4727]: I1001 13:26:13.543646 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xq5nj_9d2b91c6-4b36-4774-b2ca-59e9d5757b15/extract-content/0.log" Oct 01 13:26:13 crc kubenswrapper[4727]: I1001 13:26:13.544208 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s4cld_cd4c2104-54e9-42be-8a78-b4674c3e7b7c/registry-server/0.log" Oct 01 13:26:13 crc kubenswrapper[4727]: I1001 13:26:13.582888 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xq5nj_9d2b91c6-4b36-4774-b2ca-59e9d5757b15/extract-utilities/0.log" Oct 01 13:26:13 crc kubenswrapper[4727]: I1001 13:26:13.772071 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xq5nj_9d2b91c6-4b36-4774-b2ca-59e9d5757b15/extract-utilities/0.log" Oct 01 13:26:13 crc kubenswrapper[4727]: I1001 13:26:13.793434 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xq5nj_9d2b91c6-4b36-4774-b2ca-59e9d5757b15/extract-content/0.log" Oct 01 13:26:14 crc kubenswrapper[4727]: I1001 13:26:14.098538 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw_3e0c2a76-78ea-4d0c-bfd4-541f729a20a0/util/0.log" Oct 01 13:26:14 crc kubenswrapper[4727]: I1001 13:26:14.291143 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw_3e0c2a76-78ea-4d0c-bfd4-541f729a20a0/util/0.log" Oct 01 13:26:14 crc kubenswrapper[4727]: I1001 13:26:14.433920 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw_3e0c2a76-78ea-4d0c-bfd4-541f729a20a0/pull/0.log" Oct 01 13:26:14 crc kubenswrapper[4727]: I1001 13:26:14.437082 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw_3e0c2a76-78ea-4d0c-bfd4-541f729a20a0/pull/0.log" Oct 01 13:26:14 crc kubenswrapper[4727]: I1001 13:26:14.527445 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xq5nj_9d2b91c6-4b36-4774-b2ca-59e9d5757b15/registry-server/0.log" Oct 01 13:26:14 crc kubenswrapper[4727]: I1001 13:26:14.713501 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw_3e0c2a76-78ea-4d0c-bfd4-541f729a20a0/util/0.log" Oct 01 13:26:14 crc kubenswrapper[4727]: I1001 13:26:14.727374 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw_3e0c2a76-78ea-4d0c-bfd4-541f729a20a0/extract/0.log" Oct 01 13:26:14 crc kubenswrapper[4727]: I1001 13:26:14.727853 4727 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96rw2jw_3e0c2a76-78ea-4d0c-bfd4-541f729a20a0/pull/0.log" Oct 01 13:26:14 crc kubenswrapper[4727]: I1001 13:26:14.950180 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8jhtf_f2bd8192-96d6-40cd-877f-3a288140a8e9/marketplace-operator/0.log" Oct 01 13:26:15 crc kubenswrapper[4727]: I1001 13:26:15.018542 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dk8rp_b24a2466-b1b7-4da7-bc8a-03d9add0de40/extract-utilities/0.log" Oct 01 13:26:15 crc kubenswrapper[4727]: I1001 13:26:15.217447 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dk8rp_b24a2466-b1b7-4da7-bc8a-03d9add0de40/extract-utilities/0.log" Oct 01 13:26:15 crc kubenswrapper[4727]: I1001 13:26:15.269127 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dk8rp_b24a2466-b1b7-4da7-bc8a-03d9add0de40/extract-content/0.log" Oct 01 13:26:15 crc kubenswrapper[4727]: I1001 13:26:15.271288 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dk8rp_b24a2466-b1b7-4da7-bc8a-03d9add0de40/extract-content/0.log" Oct 01 13:26:15 crc kubenswrapper[4727]: I1001 13:26:15.458570 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dk8rp_b24a2466-b1b7-4da7-bc8a-03d9add0de40/extract-utilities/0.log" Oct 01 13:26:15 crc kubenswrapper[4727]: I1001 13:26:15.494988 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dk8rp_b24a2466-b1b7-4da7-bc8a-03d9add0de40/extract-content/0.log" Oct 01 13:26:15 crc kubenswrapper[4727]: I1001 13:26:15.679399 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dk8rp_b24a2466-b1b7-4da7-bc8a-03d9add0de40/registry-server/0.log" Oct 01 13:26:15 crc kubenswrapper[4727]: I1001 13:26:15.738588 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k7cvr_44d99ed4-f0f6-4597-a224-941f817df121/extract-utilities/0.log" Oct 01 13:26:15 crc kubenswrapper[4727]: I1001 13:26:15.954041 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k7cvr_44d99ed4-f0f6-4597-a224-941f817df121/extract-content/0.log" Oct 01 13:26:15 crc kubenswrapper[4727]: I1001 13:26:15.971142 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k7cvr_44d99ed4-f0f6-4597-a224-941f817df121/extract-utilities/0.log" Oct 01 13:26:15 crc kubenswrapper[4727]: I1001 13:26:15.972986 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k7cvr_44d99ed4-f0f6-4597-a224-941f817df121/extract-content/0.log" Oct 01 13:26:16 crc kubenswrapper[4727]: I1001 13:26:16.151903 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k7cvr_44d99ed4-f0f6-4597-a224-941f817df121/extract-content/0.log" Oct 01 13:26:16 crc kubenswrapper[4727]: I1001 13:26:16.158968 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k7cvr_44d99ed4-f0f6-4597-a224-941f817df121/extract-utilities/0.log" Oct 01 13:26:16 crc kubenswrapper[4727]: I1001 13:26:16.453278 4727 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-k7cvr_44d99ed4-f0f6-4597-a224-941f817df121/registry-server/0.log" Oct 01 13:26:33 crc kubenswrapper[4727]: I1001 13:26:33.291649 4727 patch_prober.go:28] interesting pod/machine-config-daemon-c7jw9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:26:33 crc kubenswrapper[4727]: I1001 13:26:33.292364 4727 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:26:33 crc kubenswrapper[4727]: I1001 13:26:33.292424 4727 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" Oct 01 13:26:33 crc kubenswrapper[4727]: I1001 13:26:33.293337 4727 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128"} pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:26:33 crc kubenswrapper[4727]: I1001 13:26:33.293408 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" containerName="machine-config-daemon" containerID="cri-o://21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" gracePeriod=600 Oct 01 13:26:33 crc kubenswrapper[4727]: E1001 13:26:33.452593 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:26:34 crc kubenswrapper[4727]: I1001 13:26:34.342122 4727 generic.go:334] "Generic (PLEG): container finished" podID="d18290ae-64a5-44a5-a704-90977d85852b" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" exitCode=0 Oct 01 13:26:34 crc kubenswrapper[4727]: I1001 13:26:34.342198 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" event={"ID":"d18290ae-64a5-44a5-a704-90977d85852b","Type":"ContainerDied","Data":"21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128"} Oct 01 13:26:34 crc kubenswrapper[4727]: I1001 13:26:34.342263 4727 scope.go:117] "RemoveContainer" containerID="0b20f9df355cf8a1786e71a3d4bf9a8db762df0e7ec9fae3b46c91317a229a05" Oct 01 13:26:34 crc kubenswrapper[4727]: I1001 13:26:34.343501 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:26:34 crc kubenswrapper[4727]: E1001 13:26:34.343932 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:26:42 crc kubenswrapper[4727]: E1001 13:26:42.982195 4727 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.69:56518->38.102.83.69:41793: write tcp 38.102.83.69:56518->38.102.83.69:41793: write: broken pipe Oct 01 13:26:47 crc kubenswrapper[4727]: I1001 13:26:47.375196 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:26:47 crc kubenswrapper[4727]: E1001 13:26:47.376136 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:27:01 crc kubenswrapper[4727]: I1001 13:27:01.372886 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:27:01 crc kubenswrapper[4727]: E1001 13:27:01.373917 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:27:14 crc kubenswrapper[4727]: I1001 13:27:14.373752 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:27:14 crc kubenswrapper[4727]: E1001 13:27:14.375211 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:27:29 crc kubenswrapper[4727]: I1001 13:27:29.380575 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:27:29 crc kubenswrapper[4727]: E1001 13:27:29.381341 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:27:41 crc kubenswrapper[4727]: I1001 13:27:41.372373 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:27:41 crc kubenswrapper[4727]: E1001 13:27:41.373299 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:27:54 crc kubenswrapper[4727]: I1001 13:27:54.382344 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:27:54 crc kubenswrapper[4727]: E1001 13:27:54.387822 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:28:08 crc kubenswrapper[4727]: I1001 13:28:08.372717 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:28:08 crc kubenswrapper[4727]: E1001 13:28:08.373538 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:28:18 crc kubenswrapper[4727]: I1001 13:28:18.410784 4727 generic.go:334] "Generic (PLEG): container finished" podID="479fa617-cf9f-4bf7-9290-5833831b934b" containerID="dbd39415acaebb0aabbc33cbbc19629236ef70f90fe2c1943ea4e16d007cd5eb" exitCode=0 Oct 01 13:28:18 crc kubenswrapper[4727]: I1001 13:28:18.412435 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6wxt/must-gather-2jrvc" event={"ID":"479fa617-cf9f-4bf7-9290-5833831b934b","Type":"ContainerDied","Data":"dbd39415acaebb0aabbc33cbbc19629236ef70f90fe2c1943ea4e16d007cd5eb"} Oct 01 13:28:18 crc kubenswrapper[4727]: I1001 13:28:18.413886 4727 scope.go:117] "RemoveContainer" containerID="dbd39415acaebb0aabbc33cbbc19629236ef70f90fe2c1943ea4e16d007cd5eb" Oct 01 13:28:19 crc kubenswrapper[4727]: I1001 13:28:19.049412 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6wxt_must-gather-2jrvc_479fa617-cf9f-4bf7-9290-5833831b934b/gather/0.log" Oct 01 13:28:22 crc kubenswrapper[4727]: I1001 13:28:22.381588 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:28:22 crc kubenswrapper[4727]: E1001 13:28:22.382232 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.111895 4727 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6wxt/must-gather-2jrvc"] Oct 01 13:28:27 crc 
kubenswrapper[4727]: I1001 13:28:27.112876 4727 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-d6wxt/must-gather-2jrvc" podUID="479fa617-cf9f-4bf7-9290-5833831b934b" containerName="copy" containerID="cri-o://e6c580c297dbab7a6d8786dc91a12f7272db08e8312eb4814f7107daab6e0171" gracePeriod=2 Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.125574 4727 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6wxt/must-gather-2jrvc"] Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.533069 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6wxt_must-gather-2jrvc_479fa617-cf9f-4bf7-9290-5833831b934b/copy/0.log" Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.533251 4727 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6wxt_must-gather-2jrvc_479fa617-cf9f-4bf7-9290-5833831b934b/copy/0.log" Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.533527 4727 generic.go:334] "Generic (PLEG): container finished" podID="479fa617-cf9f-4bf7-9290-5833831b934b" containerID="e6c580c297dbab7a6d8786dc91a12f7272db08e8312eb4814f7107daab6e0171" exitCode=143 Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.533584 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b992e6f322a06d006a0f44e55f009160d540029fb50361590b79b075f09bd134" Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.533639 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6wxt/must-gather-2jrvc" Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.648114 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhndb\" (UniqueName: \"kubernetes.io/projected/479fa617-cf9f-4bf7-9290-5833831b934b-kube-api-access-rhndb\") pod \"479fa617-cf9f-4bf7-9290-5833831b934b\" (UID: \"479fa617-cf9f-4bf7-9290-5833831b934b\") " Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.648526 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/479fa617-cf9f-4bf7-9290-5833831b934b-must-gather-output\") pod \"479fa617-cf9f-4bf7-9290-5833831b934b\" (UID: \"479fa617-cf9f-4bf7-9290-5833831b934b\") " Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.661348 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479fa617-cf9f-4bf7-9290-5833831b934b-kube-api-access-rhndb" (OuterVolumeSpecName: "kube-api-access-rhndb") pod "479fa617-cf9f-4bf7-9290-5833831b934b" (UID: "479fa617-cf9f-4bf7-9290-5833831b934b"). InnerVolumeSpecName "kube-api-access-rhndb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.763548 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhndb\" (UniqueName: \"kubernetes.io/projected/479fa617-cf9f-4bf7-9290-5833831b934b-kube-api-access-rhndb\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.860936 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479fa617-cf9f-4bf7-9290-5833831b934b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "479fa617-cf9f-4bf7-9290-5833831b934b" (UID: "479fa617-cf9f-4bf7-9290-5833831b934b"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:28:27 crc kubenswrapper[4727]: I1001 13:28:27.865445 4727 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/479fa617-cf9f-4bf7-9290-5833831b934b-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:28 crc kubenswrapper[4727]: I1001 13:28:28.387435 4727 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479fa617-cf9f-4bf7-9290-5833831b934b" path="/var/lib/kubelet/pods/479fa617-cf9f-4bf7-9290-5833831b934b/volumes" Oct 01 13:28:28 crc kubenswrapper[4727]: I1001 13:28:28.541365 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6wxt/must-gather-2jrvc" Oct 01 13:28:33 crc kubenswrapper[4727]: I1001 13:28:33.373164 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:28:33 crc kubenswrapper[4727]: E1001 13:28:33.374036 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:28:45 crc kubenswrapper[4727]: I1001 13:28:45.372566 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:28:45 crc kubenswrapper[4727]: E1001 13:28:45.373431 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:29:00 crc kubenswrapper[4727]: I1001 13:29:00.373537 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:29:00 crc kubenswrapper[4727]: E1001 13:29:00.374905 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:29:11 crc kubenswrapper[4727]: I1001 13:29:11.373042 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:29:11 crc kubenswrapper[4727]: E1001 13:29:11.374134 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:29:12 crc kubenswrapper[4727]: I1001 13:29:12.198945 4727 scope.go:117] 
"RemoveContainer" containerID="e6c580c297dbab7a6d8786dc91a12f7272db08e8312eb4814f7107daab6e0171" Oct 01 13:29:12 crc kubenswrapper[4727]: I1001 13:29:12.229821 4727 scope.go:117] "RemoveContainer" containerID="dbd39415acaebb0aabbc33cbbc19629236ef70f90fe2c1943ea4e16d007cd5eb" Oct 01 13:29:26 crc kubenswrapper[4727]: I1001 13:29:26.372716 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:29:26 crc kubenswrapper[4727]: E1001 13:29:26.373640 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:29:39 crc kubenswrapper[4727]: I1001 13:29:39.372967 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:29:39 crc kubenswrapper[4727]: E1001 13:29:39.374316 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:29:52 crc kubenswrapper[4727]: I1001 13:29:52.378831 4727 scope.go:117] "RemoveContainer" containerID="21070767e1e9adb1fdd93b30f0f7a9f9798d3c489a6476ac64ea84ad6be77128" Oct 01 13:29:52 crc kubenswrapper[4727]: E1001 13:29:52.379805 4727 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c7jw9_openshift-machine-config-operator(d18290ae-64a5-44a5-a704-90977d85852b)\"" pod="openshift-machine-config-operator/machine-config-daemon-c7jw9" podUID="d18290ae-64a5-44a5-a704-90977d85852b" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.183758 4727 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86"] Oct 01 13:30:00 crc kubenswrapper[4727]: E1001 13:30:00.185385 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479fa617-cf9f-4bf7-9290-5833831b934b" containerName="copy" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.185415 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="479fa617-cf9f-4bf7-9290-5833831b934b" containerName="copy" Oct 01 13:30:00 crc kubenswrapper[4727]: E1001 13:30:00.185449 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44876bc2-f5ce-41d6-b972-360d61acc6fb" containerName="container-00" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.185458 4727 state_mem.go:107] "Deleted CPUSet assignment" podUID="44876bc2-f5ce-41d6-b972-360d61acc6fb" containerName="container-00" Oct 01 13:30:00 crc kubenswrapper[4727]: E1001 13:30:00.185488 4727 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479fa617-cf9f-4bf7-9290-5833831b934b" containerName="gather" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.185497 4727 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="479fa617-cf9f-4bf7-9290-5833831b934b" containerName="gather" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.185748 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="44876bc2-f5ce-41d6-b972-360d61acc6fb" containerName="container-00" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.185797 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="479fa617-cf9f-4bf7-9290-5833831b934b" containerName="copy" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.185818 4727 memory_manager.go:354] "RemoveStaleState removing state" podUID="479fa617-cf9f-4bf7-9290-5833831b934b" containerName="gather" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.186944 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.190717 4727 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.191130 4727 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.199115 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86"] Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.312814 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-config-volume\") pod \"collect-profiles-29322090-zjq86\" (UID: \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.312897 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-secret-volume\") pod \"collect-profiles-29322090-zjq86\" (UID: \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.313484 4727 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkpjz\" (UniqueName: \"kubernetes.io/projected/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-kube-api-access-gkpjz\") pod \"collect-profiles-29322090-zjq86\" (UID: \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.415208 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-config-volume\") pod \"collect-profiles-29322090-zjq86\" (UID: \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.415313 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-secret-volume\") pod \"collect-profiles-29322090-zjq86\" (UID: 
\"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.415397 4727 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkpjz\" (UniqueName: \"kubernetes.io/projected/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-kube-api-access-gkpjz\") pod \"collect-profiles-29322090-zjq86\" (UID: \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.416274 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-config-volume\") pod \"collect-profiles-29322090-zjq86\" (UID: \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.423446 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-secret-volume\") pod \"collect-profiles-29322090-zjq86\" (UID: \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.439614 4727 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkpjz\" (UniqueName: \"kubernetes.io/projected/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-kube-api-access-gkpjz\") pod \"collect-profiles-29322090-zjq86\" (UID: \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.516079 4727 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" Oct 01 13:30:00 crc kubenswrapper[4727]: I1001 13:30:00.977830 4727 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86"] Oct 01 13:30:01 crc kubenswrapper[4727]: I1001 13:30:01.428624 4727 generic.go:334] "Generic (PLEG): container finished" podID="dacc6da2-56a0-46bf-8ab1-114b8813b6a1" containerID="dacbd8b486ea00762e5e5e5b34aadee53e2ef775567ac616b92a4057e1308f34" exitCode=0 Oct 01 13:30:01 crc kubenswrapper[4727]: I1001 13:30:01.429316 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" event={"ID":"dacc6da2-56a0-46bf-8ab1-114b8813b6a1","Type":"ContainerDied","Data":"dacbd8b486ea00762e5e5e5b34aadee53e2ef775567ac616b92a4057e1308f34"} Oct 01 13:30:01 crc kubenswrapper[4727]: I1001 13:30:01.429400 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" event={"ID":"dacc6da2-56a0-46bf-8ab1-114b8813b6a1","Type":"ContainerStarted","Data":"f8d8fe3298985e58d4b8aeb4f36da03b9646690a0e4d598f65c06ec75c9dbe10"} Oct 01 13:30:02 crc kubenswrapper[4727]: I1001 13:30:02.723053 4727 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" Oct 01 13:30:02 crc kubenswrapper[4727]: I1001 13:30:02.859209 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkpjz\" (UniqueName: \"kubernetes.io/projected/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-kube-api-access-gkpjz\") pod \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\" (UID: \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\") " Oct 01 13:30:02 crc kubenswrapper[4727]: I1001 13:30:02.859550 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-config-volume\") pod \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\" (UID: \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\") " Oct 01 13:30:02 crc kubenswrapper[4727]: I1001 13:30:02.859690 4727 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-secret-volume\") pod \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\" (UID: \"dacc6da2-56a0-46bf-8ab1-114b8813b6a1\") " Oct 01 13:30:02 crc kubenswrapper[4727]: I1001 13:30:02.860142 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-config-volume" (OuterVolumeSpecName: "config-volume") pod "dacc6da2-56a0-46bf-8ab1-114b8813b6a1" (UID: "dacc6da2-56a0-46bf-8ab1-114b8813b6a1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:30:02 crc kubenswrapper[4727]: I1001 13:30:02.860550 4727 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:02 crc kubenswrapper[4727]: I1001 13:30:02.865401 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-kube-api-access-gkpjz" (OuterVolumeSpecName: "kube-api-access-gkpjz") pod "dacc6da2-56a0-46bf-8ab1-114b8813b6a1" (UID: "dacc6da2-56a0-46bf-8ab1-114b8813b6a1"). InnerVolumeSpecName "kube-api-access-gkpjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:02 crc kubenswrapper[4727]: I1001 13:30:02.865402 4727 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dacc6da2-56a0-46bf-8ab1-114b8813b6a1" (UID: "dacc6da2-56a0-46bf-8ab1-114b8813b6a1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:02 crc kubenswrapper[4727]: I1001 13:30:02.961780 4727 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkpjz\" (UniqueName: \"kubernetes.io/projected/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-kube-api-access-gkpjz\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:02 crc kubenswrapper[4727]: I1001 13:30:02.961823 4727 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dacc6da2-56a0-46bf-8ab1-114b8813b6a1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:03 crc kubenswrapper[4727]: I1001 13:30:03.458653 4727 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" event={"ID":"dacc6da2-56a0-46bf-8ab1-114b8813b6a1","Type":"ContainerDied","Data":"f8d8fe3298985e58d4b8aeb4f36da03b9646690a0e4d598f65c06ec75c9dbe10"} Oct 01 13:30:03 crc kubenswrapper[4727]: I1001 13:30:03.459095 4727 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8d8fe3298985e58d4b8aeb4f36da03b9646690a0e4d598f65c06ec75c9dbe10" Oct 01 13:30:03 crc kubenswrapper[4727]: I1001 13:30:03.458704 4727 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-zjq86" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515067226342024454 0ustar coreroot‹íÁ  ÷Om7 €7šÞ'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015067226342017371 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015067217654016522 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015067217654015472 5ustar corecore